0s autopkgtest [00:22:01]: starting date and time: 2024-11-24 00:22:01+0000
0s autopkgtest [00:22:01]: git checkout: 6f3be7a8 Fix armhf LXD image generation for plucky
0s autopkgtest [00:22:01]: host juju-7f2275-prod-proposed-migration-environment-20; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.zejr9je1/out --timeout-copy=6000 -a i386 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:pyreadstat,src:python3-defaults --apt-upgrade pandas --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 '--env=ADT_TEST_TRIGGERS=pyreadstat/1.2.8-1 python3-defaults/3.12.7-1' -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor builder-cpu2-ram4-disk20 --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-20@bos03-13.secgroup --name adt-plucky-i386-pandas-20241123-203106-juju-7f2275-prod-proposed-migration-environment-20-cde5351b-7499-48e1-9397-9f9ae29b1add --image adt/ubuntu-plucky-amd64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-20 --net-id=net_prod-proposed-migration-amd64 -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/
42s autopkgtest [00:22:43]: testbed dpkg architecture: amd64
43s autopkgtest [00:22:44]: testbed apt version: 2.9.8
43s autopkgtest [00:22:44]: test architecture: i386
43s autopkgtest [00:22:44]: @@@@@@@@@@@@@@@@@@@@ test bed setup
44s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [73.9 kB]
44s Get:2 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [13.6 kB]
44s Get:3 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [49.1 kB]
44s Get:4 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [892 kB]
44s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/restricted Sources [9704 B]
44s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/main amd64 Packages [78.8 kB]
44s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/main i386 Packages [39.0 kB]
44s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/restricted amd64 Packages [40.1 kB]
44s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/restricted i386 Packages [2408 B]
44s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/universe amd64 Packages [758 kB]
45s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe i386 Packages [267 kB]
45s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse i386 Packages [5528 B]
45s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse amd64 Packages [20.0 kB]
45s Fetched 2248 kB in 1s (2788 kB/s)
45s Reading package lists...
47s Reading package lists...
47s Building dependency tree...
47s Reading state information...
47s Calculating upgrade...
47s The following package was automatically installed and is no longer required:
47s libsgutils2-1.46-2
47s Use 'sudo apt autoremove' to remove it.
47s The following NEW packages will be installed:
47s libsgutils2-1.48
47s The following packages will be upgraded:
47s amd64-microcode bash bpftrace curl debconf debconf-i18n distro-info
47s dracut-install fwupd-signed gir1.2-girepository-2.0 gir1.2-glib-2.0 hostname
47s init init-system-helpers intel-microcode libaudit-common libaudit1
47s libcurl3t64-gnutls libcurl4t64 libgirepository-1.0-1 libglib2.0-0t64
47s libglib2.0-data liblzma5 libpam-modules libpam-modules-bin libpam-runtime
47s libpam0g libplymouth5 libpython3-stdlib libselinux1 libsemanage-common
47s libsemanage2 linux-base lxd-installer openssh-client openssh-server
47s openssh-sftp-server pinentry-curses plymouth plymouth-theme-ubuntu-text
47s python3 python3-blinker python3-dbus python3-debconf python3-gi
47s python3-jsonschema-specifications python3-minimal python3-rpds-py
47s python3-yaml sg3-utils sg3-utils-udev vim-common vim-tiny xxd xz-utils
47s 55 upgraded, 1 newly installed, 0 to remove and 0 not upgraded.
47s Need to get 20.4 MB of archives.
47s After this operation, 4555 kB of additional disk space will be used.
47s Get:1 http://ftpmaster.internal/ubuntu plucky/main amd64 bash amd64 5.2.32-1ubuntu2 [918 kB]
49s Get:2 http://ftpmaster.internal/ubuntu plucky/main amd64 hostname amd64 3.25 [11.1 kB]
49s Get:3 http://ftpmaster.internal/ubuntu plucky/main amd64 init-system-helpers all 1.67ubuntu1 [39.1 kB]
49s Get:4 http://ftpmaster.internal/ubuntu plucky/main amd64 libaudit-common all 1:4.0.2-2ubuntu1 [6578 B]
49s Get:5 http://ftpmaster.internal/ubuntu plucky/main amd64 libaudit1 amd64 1:4.0.2-2ubuntu1 [53.9 kB]
49s Get:6 http://ftpmaster.internal/ubuntu plucky/main amd64 debconf-i18n all 1.5.87ubuntu1 [204 kB]
49s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/main amd64 python3-minimal amd64 3.12.7-1 [27.4 kB]
49s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main amd64 python3 amd64 3.12.7-1 [24.0 kB]
49s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main amd64 libpython3-stdlib amd64 3.12.7-1 [10.0 kB]
49s Get:10 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-debconf all 1.5.87ubuntu1 [4156 B]
49s Get:11 http://ftpmaster.internal/ubuntu plucky/main amd64 debconf all 1.5.87ubuntu1 [124 kB]
49s Get:12 http://ftpmaster.internal/ubuntu plucky/main amd64 libpam0g amd64 1.5.3-7ubuntu4 [69.6 kB]
49s Get:13 http://ftpmaster.internal/ubuntu plucky/main amd64 libselinux1 amd64 3.7-3ubuntu1 [86.9 kB]
49s Get:14 http://ftpmaster.internal/ubuntu plucky/main amd64 libpam-modules-bin amd64 1.5.3-7ubuntu4 [53.7 kB]
49s Get:15 http://ftpmaster.internal/ubuntu plucky/main amd64 libpam-modules amd64 1.5.3-7ubuntu4 [294 kB]
49s Get:16 http://ftpmaster.internal/ubuntu plucky/main amd64 init amd64 1.67ubuntu1 [6428 B]
49s Get:17 http://ftpmaster.internal/ubuntu plucky/main amd64 openssh-sftp-server amd64 1:9.9p1-3ubuntu2 [41.2 kB]
49s Get:18 http://ftpmaster.internal/ubuntu plucky/main amd64 openssh-server amd64 1:9.9p1-3ubuntu2 [625 kB]
49s Get:19 http://ftpmaster.internal/ubuntu plucky/main amd64 openssh-client amd64 1:9.9p1-3ubuntu2 [1080 kB]
49s Get:20 http://ftpmaster.internal/ubuntu plucky/main amd64 libpam-runtime all 1.5.3-7ubuntu4 [40.8 kB]
49s Get:21 http://ftpmaster.internal/ubuntu plucky/main amd64 liblzma5 amd64 5.6.3-1 [156 kB]
49s Get:22 http://ftpmaster.internal/ubuntu plucky/main amd64 libsemanage-common all 3.7-2build1 [7186 B]
49s Get:23 http://ftpmaster.internal/ubuntu plucky/main amd64 libsemanage2 amd64 3.7-2build1 [105 kB]
49s Get:24 http://ftpmaster.internal/ubuntu plucky/main amd64 distro-info amd64 1.12 [20.0 kB]
49s Get:25 http://ftpmaster.internal/ubuntu plucky/main amd64 gir1.2-girepository-2.0 amd64 1.82.0-2 [25.3 kB]
49s Get:26 http://ftpmaster.internal/ubuntu plucky/main amd64 gir1.2-glib-2.0 amd64 2.82.2-3 [182 kB]
49s Get:27 http://ftpmaster.internal/ubuntu plucky/main amd64 libglib2.0-0t64 amd64 2.82.2-3 [1655 kB]
49s Get:28 http://ftpmaster.internal/ubuntu plucky/main amd64 libgirepository-1.0-1 amd64 1.82.0-2 [88.7 kB]
49s Get:29 http://ftpmaster.internal/ubuntu plucky/main amd64 libglib2.0-data all 2.82.2-3 [51.7 kB]
49s Get:30 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-dbus amd64 1.3.2-5build4 [110 kB]
49s Get:31 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-gi amd64 3.50.0-3build1 [293 kB]
49s Get:32 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-yaml amd64 6.0.2-1build1 [187 kB]
49s Get:33 http://ftpmaster.internal/ubuntu plucky/main amd64 vim-tiny amd64 2:9.1.0861-1ubuntu1 [1037 kB]
49s Get:34 http://ftpmaster.internal/ubuntu plucky/main amd64 vim-common all 2:9.1.0861-1ubuntu1 [395 kB]
49s Get:35 http://ftpmaster.internal/ubuntu plucky/main amd64 xxd amd64 2:9.1.0861-1ubuntu1 [67.8 kB]
49s Get:36 http://ftpmaster.internal/ubuntu plucky/main amd64 libplymouth5 amd64 24.004.60-2ubuntu3 [145 kB]
49s Get:37 http://ftpmaster.internal/ubuntu plucky/main amd64 plymouth-theme-ubuntu-text amd64 24.004.60-2ubuntu3 [10.3 kB]
49s Get:38 http://ftpmaster.internal/ubuntu plucky/main amd64 plymouth amd64 24.004.60-2ubuntu3 [140 kB]
49s Get:39 http://ftpmaster.internal/ubuntu plucky/main amd64 xz-utils amd64 5.6.3-1 [276 kB]
49s Get:40 http://ftpmaster.internal/ubuntu plucky/main amd64 bpftrace amd64 0.21.2-2ubuntu3 [1787 kB]
49s Get:41 http://ftpmaster.internal/ubuntu plucky/main amd64 curl amd64 8.9.1-2ubuntu3 [243 kB]
49s Get:42 http://ftpmaster.internal/ubuntu plucky/main amd64 libcurl4t64 amd64 8.9.1-2ubuntu3 [420 kB]
49s Get:43 http://ftpmaster.internal/ubuntu plucky/main amd64 dracut-install amd64 105-2ubuntu2 [35.9 kB]
49s Get:44 http://ftpmaster.internal/ubuntu plucky/main amd64 fwupd-signed amd64 1.55+1.7-1 [30.6 kB]
49s Get:45 http://ftpmaster.internal/ubuntu plucky/main amd64 libcurl3t64-gnutls amd64 8.9.1-2ubuntu3 [412 kB]
49s Get:46 http://ftpmaster.internal/ubuntu plucky/main amd64 libsgutils2-1.48 amd64 1.48-0ubuntu1 [124 kB]
49s Get:47 http://ftpmaster.internal/ubuntu plucky/main amd64 linux-base all 4.10.1ubuntu1 [34.8 kB]
49s Get:48 http://ftpmaster.internal/ubuntu plucky/main amd64 lxd-installer all 10 [5264 B]
49s Get:49 http://ftpmaster.internal/ubuntu plucky/main amd64 pinentry-curses amd64 1.3.1-0ubuntu2 [41.6 kB]
49s Get:50 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-blinker all 1.9.0-1 [10.7 kB]
49s Get:51 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-rpds-py amd64 0.21.0-2ubuntu1 [323 kB]
49s Get:52 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-jsonschema-specifications all 2023.12.1-2 [9116 B]
49s Get:53 http://ftpmaster.internal/ubuntu plucky/main amd64 sg3-utils amd64 1.48-0ubuntu1 [1042 kB]
49s Get:54 http://ftpmaster.internal/ubuntu plucky/main amd64 sg3-utils-udev all 1.48-0ubuntu1 [6608 B]
49s Get:55 http://ftpmaster.internal/ubuntu plucky/main amd64 amd64-microcode amd64 3.20240820.1ubuntu1 [187 kB]
49s Get:56 http://ftpmaster.internal/ubuntu plucky/main amd64 intel-microcode amd64 3.20241112.1ubuntu2 [7055 kB]
49s Preconfiguring packages ...
49s Fetched 20.4 MB in 1s (14.0 MB/s)
49s (Reading database ... 75620 files and directories currently installed.)
49s Preparing to unpack .../bash_5.2.32-1ubuntu2_amd64.deb ...
49s Unpacking bash (5.2.32-1ubuntu2) over (5.2.32-1ubuntu1) ...
49s Setting up bash (5.2.32-1ubuntu2) ...
49s update-alternatives: using /usr/share/man/man7/bash-builtins.7.gz to provide /usr/share/man/man7/builtins.7.gz (builtins.7.gz) in auto mode
49s (Reading database ... 75620 files and directories currently installed.)
49s Preparing to unpack .../hostname_3.25_amd64.deb ...
49s Unpacking hostname (3.25) over (3.23+nmu2ubuntu2) ...
49s Setting up hostname (3.25) ...
49s (Reading database ... 75620 files and directories currently installed.)
49s Preparing to unpack .../init-system-helpers_1.67ubuntu1_all.deb ...
49s Unpacking init-system-helpers (1.67ubuntu1) over (1.66ubuntu1) ...
50s Setting up init-system-helpers (1.67ubuntu1) ...
50s (Reading database ... 75620 files and directories currently installed.)
50s Preparing to unpack .../libaudit-common_1%3a4.0.2-2ubuntu1_all.deb ...
50s Unpacking libaudit-common (1:4.0.2-2ubuntu1) over (1:4.0.1-1ubuntu2) ...
50s Setting up libaudit-common (1:4.0.2-2ubuntu1) ...
50s (Reading database ... 75620 files and directories currently installed.)
50s Preparing to unpack .../libaudit1_1%3a4.0.2-2ubuntu1_amd64.deb ...
50s Unpacking libaudit1:amd64 (1:4.0.2-2ubuntu1) over (1:4.0.1-1ubuntu2) ...
50s Setting up libaudit1:amd64 (1:4.0.2-2ubuntu1) ...
50s (Reading database ... 75620 files and directories currently installed.)
50s Preparing to unpack .../debconf-i18n_1.5.87ubuntu1_all.deb ...
50s Unpacking debconf-i18n (1.5.87ubuntu1) over (1.5.86ubuntu1) ...
50s Preparing to unpack .../python3-minimal_3.12.7-1_amd64.deb ...
50s Unpacking python3-minimal (3.12.7-1) over (3.12.6-0ubuntu1) ...
50s Setting up python3-minimal (3.12.7-1) ...
50s (Reading database ... 75620 files and directories currently installed.)
50s Preparing to unpack .../python3_3.12.7-1_amd64.deb ...
50s Unpacking python3 (3.12.7-1) over (3.12.6-0ubuntu1) ...
50s Preparing to unpack .../libpython3-stdlib_3.12.7-1_amd64.deb ...
50s Unpacking libpython3-stdlib:amd64 (3.12.7-1) over (3.12.6-0ubuntu1) ...
50s Preparing to unpack .../python3-debconf_1.5.87ubuntu1_all.deb ...
50s Unpacking python3-debconf (1.5.87ubuntu1) over (1.5.86ubuntu1) ...
50s Preparing to unpack .../debconf_1.5.87ubuntu1_all.deb ...
50s Unpacking debconf (1.5.87ubuntu1) over (1.5.86ubuntu1) ...
50s Setting up debconf (1.5.87ubuntu1) ...
50s (Reading database ... 75620 files and directories currently installed.)
50s Preparing to unpack .../libpam0g_1.5.3-7ubuntu4_amd64.deb ...
50s Unpacking libpam0g:amd64 (1.5.3-7ubuntu4) over (1.5.3-7ubuntu2) ...
50s Setting up libpam0g:amd64 (1.5.3-7ubuntu4) ...
51s (Reading database ... 75620 files and directories currently installed.)
51s Preparing to unpack .../libselinux1_3.7-3ubuntu1_amd64.deb ...
51s Unpacking libselinux1:amd64 (3.7-3ubuntu1) over (3.5-2ubuntu5) ...
51s Setting up libselinux1:amd64 (3.7-3ubuntu1) ...
51s (Reading database ... 75620 files and directories currently installed.)
51s Preparing to unpack .../libpam-modules-bin_1.5.3-7ubuntu4_amd64.deb ...
51s Unpacking libpam-modules-bin (1.5.3-7ubuntu4) over (1.5.3-7ubuntu2) ...
51s Setting up libpam-modules-bin (1.5.3-7ubuntu4) ...
51s pam_namespace.service is a disabled or a static unit not running, not starting it.
51s (Reading database ... 75620 files and directories currently installed.)
51s Preparing to unpack .../libpam-modules_1.5.3-7ubuntu4_amd64.deb ...
51s Unpacking libpam-modules:amd64 (1.5.3-7ubuntu4) over (1.5.3-7ubuntu2) ...
51s Setting up libpam-modules:amd64 (1.5.3-7ubuntu4) ...
51s (Reading database ... 75620 files and directories currently installed.)
51s Preparing to unpack .../init_1.67ubuntu1_amd64.deb ...
51s Unpacking init (1.67ubuntu1) over (1.66ubuntu1) ...
51s Preparing to unpack .../openssh-sftp-server_1%3a9.9p1-3ubuntu2_amd64.deb ...
51s Unpacking openssh-sftp-server (1:9.9p1-3ubuntu2) over (1:9.7p1-7ubuntu5) ...
51s Preparing to unpack .../openssh-server_1%3a9.9p1-3ubuntu2_amd64.deb ...
51s Unpacking openssh-server (1:9.9p1-3ubuntu2) over (1:9.7p1-7ubuntu5) ...
51s Preparing to unpack .../openssh-client_1%3a9.9p1-3ubuntu2_amd64.deb ...
51s Unpacking openssh-client (1:9.9p1-3ubuntu2) over (1:9.7p1-7ubuntu5) ...
51s Preparing to unpack .../libpam-runtime_1.5.3-7ubuntu4_all.deb ...
51s Unpacking libpam-runtime (1.5.3-7ubuntu4) over (1.5.3-7ubuntu2) ...
51s Setting up libpam-runtime (1.5.3-7ubuntu4) ...
52s (Reading database ... 75622 files and directories currently installed.)
52s Preparing to unpack .../liblzma5_5.6.3-1_amd64.deb ...
52s Unpacking liblzma5:amd64 (5.6.3-1) over (5.6.2-2) ...
52s Setting up liblzma5:amd64 (5.6.3-1) ...
52s (Reading database ... 75622 files and directories currently installed.)
52s Preparing to unpack .../libsemanage-common_3.7-2build1_all.deb ...
52s Unpacking libsemanage-common (3.7-2build1) over (3.5-1build6) ...
52s Setting up libsemanage-common (3.7-2build1) ...
52s (Reading database ... 75621 files and directories currently installed.)
52s Preparing to unpack .../libsemanage2_3.7-2build1_amd64.deb ...
52s Unpacking libsemanage2:amd64 (3.7-2build1) over (3.5-1build6) ...
52s Setting up libsemanage2:amd64 (3.7-2build1) ...
52s (Reading database ... 75621 files and directories currently installed.)
52s Preparing to unpack .../00-distro-info_1.12_amd64.deb ...
52s Unpacking distro-info (1.12) over (1.9) ...
52s Preparing to unpack .../01-gir1.2-girepository-2.0_1.82.0-2_amd64.deb ...
52s Unpacking gir1.2-girepository-2.0:amd64 (1.82.0-2) over (1.80.1-4) ...
52s Preparing to unpack .../02-gir1.2-glib-2.0_2.82.2-3_amd64.deb ...
52s Unpacking gir1.2-glib-2.0:amd64 (2.82.2-3) over (2.82.1-0ubuntu1) ...
52s Preparing to unpack .../03-libglib2.0-0t64_2.82.2-3_amd64.deb ...
52s Unpacking libglib2.0-0t64:amd64 (2.82.2-3) over (2.82.1-0ubuntu1) ...
52s Preparing to unpack .../04-libgirepository-1.0-1_1.82.0-2_amd64.deb ...
52s Unpacking libgirepository-1.0-1:amd64 (1.82.0-2) over (1.80.1-4) ...
52s Preparing to unpack .../05-libglib2.0-data_2.82.2-3_all.deb ...
52s Unpacking libglib2.0-data (2.82.2-3) over (2.82.1-0ubuntu1) ...
52s Preparing to unpack .../06-python3-dbus_1.3.2-5build4_amd64.deb ...
52s Unpacking python3-dbus (1.3.2-5build4) over (1.3.2-5build3) ...
52s Preparing to unpack .../07-python3-gi_3.50.0-3build1_amd64.deb ...
52s Unpacking python3-gi (3.50.0-3build1) over (3.50.0-3) ...
52s Preparing to unpack .../08-python3-yaml_6.0.2-1build1_amd64.deb ...
52s Unpacking python3-yaml (6.0.2-1build1) over (6.0.2-1) ...
52s Preparing to unpack .../09-vim-tiny_2%3a9.1.0861-1ubuntu1_amd64.deb ...
52s Unpacking vim-tiny (2:9.1.0861-1ubuntu1) over (2:9.1.0777-1ubuntu1) ...
52s Preparing to unpack .../10-vim-common_2%3a9.1.0861-1ubuntu1_all.deb ...
52s Unpacking vim-common (2:9.1.0861-1ubuntu1) over (2:9.1.0777-1ubuntu1) ...
52s Preparing to unpack .../11-xxd_2%3a9.1.0861-1ubuntu1_amd64.deb ...
52s Unpacking xxd (2:9.1.0861-1ubuntu1) over (2:9.1.0777-1ubuntu1) ...
52s Preparing to unpack .../12-libplymouth5_24.004.60-2ubuntu3_amd64.deb ...
52s Unpacking libplymouth5:amd64 (24.004.60-2ubuntu3) over (24.004.60-1ubuntu11) ...
52s Preparing to unpack .../13-plymouth-theme-ubuntu-text_24.004.60-2ubuntu3_amd64.deb ...
52s Unpacking plymouth-theme-ubuntu-text (24.004.60-2ubuntu3) over (24.004.60-1ubuntu11) ...
52s Preparing to unpack .../14-plymouth_24.004.60-2ubuntu3_amd64.deb ...
52s Unpacking plymouth (24.004.60-2ubuntu3) over (24.004.60-1ubuntu11) ...
53s Preparing to unpack .../15-xz-utils_5.6.3-1_amd64.deb ...
53s Unpacking xz-utils (5.6.3-1) over (5.6.2-2) ...
53s Preparing to unpack .../16-bpftrace_0.21.2-2ubuntu3_amd64.deb ...
53s Unpacking bpftrace (0.21.2-2ubuntu3) over (0.21.2-2ubuntu2) ...
53s Preparing to unpack .../17-curl_8.9.1-2ubuntu3_amd64.deb ...
53s Unpacking curl (8.9.1-2ubuntu3) over (8.9.1-2ubuntu2) ...
53s Preparing to unpack .../18-libcurl4t64_8.9.1-2ubuntu3_amd64.deb ...
53s Unpacking libcurl4t64:amd64 (8.9.1-2ubuntu3) over (8.9.1-2ubuntu2) ...
53s Preparing to unpack .../19-dracut-install_105-2ubuntu2_amd64.deb ...
53s Unpacking dracut-install (105-2ubuntu2) over (105-1ubuntu1) ...
53s Preparing to unpack .../20-fwupd-signed_1.55+1.7-1_amd64.deb ...
53s Unpacking fwupd-signed (1.55+1.7-1) over (1.54+1.6-1build1) ...
53s Preparing to unpack .../21-libcurl3t64-gnutls_8.9.1-2ubuntu3_amd64.deb ...
53s Unpacking libcurl3t64-gnutls:amd64 (8.9.1-2ubuntu3) over (8.9.1-2ubuntu2) ...
53s Selecting previously unselected package libsgutils2-1.48:amd64.
53s Preparing to unpack .../22-libsgutils2-1.48_1.48-0ubuntu1_amd64.deb ...
53s Unpacking libsgutils2-1.48:amd64 (1.48-0ubuntu1) ...
53s Preparing to unpack .../23-linux-base_4.10.1ubuntu1_all.deb ...
53s Unpacking linux-base (4.10.1ubuntu1) over (4.5ubuntu9) ...
53s Preparing to unpack .../24-lxd-installer_10_all.deb ...
53s Unpacking lxd-installer (10) over (9) ...
53s Preparing to unpack .../25-pinentry-curses_1.3.1-0ubuntu2_amd64.deb ...
53s Unpacking pinentry-curses (1.3.1-0ubuntu2) over (1.2.1-3ubuntu5) ...
53s Preparing to unpack .../26-python3-blinker_1.9.0-1_all.deb ...
53s Unpacking python3-blinker (1.9.0-1) over (1.8.2-1) ...
54s Preparing to unpack .../27-python3-rpds-py_0.21.0-2ubuntu1_amd64.deb ...
54s Unpacking python3-rpds-py (0.21.0-2ubuntu1) over (0.20.0-0ubuntu3) ...
54s Preparing to unpack .../28-python3-jsonschema-specifications_2023.12.1-2_all.deb ...
54s Unpacking python3-jsonschema-specifications (2023.12.1-2) over (2023.12.1-1ubuntu1) ...
54s Preparing to unpack .../29-sg3-utils_1.48-0ubuntu1_amd64.deb ...
54s Unpacking sg3-utils (1.48-0ubuntu1) over (1.46-3ubuntu5) ...
54s Preparing to unpack .../30-sg3-utils-udev_1.48-0ubuntu1_all.deb ...
54s Unpacking sg3-utils-udev (1.48-0ubuntu1) over (1.46-3ubuntu5) ...
54s Preparing to unpack .../31-amd64-microcode_3.20240820.1ubuntu1_amd64.deb ...
54s Unpacking amd64-microcode (3.20240820.1ubuntu1) over (3.20240116.2+nmu1ubuntu1.1) ...
54s Preparing to unpack .../32-intel-microcode_3.20241112.1ubuntu2_amd64.deb ...
54s Unpacking intel-microcode (3.20241112.1ubuntu2) over (3.20240910.0ubuntu1) ...
54s Setting up pinentry-curses (1.3.1-0ubuntu2) ...
54s Setting up distro-info (1.12) ...
54s Setting up linux-base (4.10.1ubuntu1) ...
54s Setting up init (1.67ubuntu1) ...
54s Setting up libcurl4t64:amd64 (8.9.1-2ubuntu3) ...
54s Setting up bpftrace (0.21.2-2ubuntu3) ...
54s Setting up openssh-client (1:9.9p1-3ubuntu2) ...
54s Setting up intel-microcode (3.20241112.1ubuntu2) ...
54s intel-microcode: microcode will be updated at next boot
54s Setting up libcurl3t64-gnutls:amd64 (8.9.1-2ubuntu3) ...
54s Setting up fwupd-signed (1.55+1.7-1) ...
54s Setting up libsgutils2-1.48:amd64 (1.48-0ubuntu1) ...
54s Setting up debconf-i18n (1.5.87ubuntu1) ...
54s Setting up amd64-microcode (3.20240820.1ubuntu1) ...
54s amd64-microcode: microcode will be updated at next boot
54s Setting up xxd (2:9.1.0861-1ubuntu1) ...
54s Setting up libglib2.0-0t64:amd64 (2.82.2-3) ...
54s No schema files found: doing nothing.
54s Setting up libglib2.0-data (2.82.2-3) ...
54s Setting up vim-common (2:9.1.0861-1ubuntu1) ...
54s Setting up xz-utils (5.6.3-1) ...
54s Setting up gir1.2-glib-2.0:amd64 (2.82.2-3) ...
54s Setting up lxd-installer (10) ...
54s Setting up dracut-install (105-2ubuntu2) ...
54s Setting up libplymouth5:amd64 (24.004.60-2ubuntu3) ...
54s Setting up libgirepository-1.0-1:amd64 (1.82.0-2) ...
54s Setting up curl (8.9.1-2ubuntu3) ...
54s Setting up libpython3-stdlib:amd64 (3.12.7-1) ...
54s Setting up sg3-utils (1.48-0ubuntu1) ...
54s Setting up openssh-sftp-server (1:9.9p1-3ubuntu2) ...
54s Setting up openssh-server (1:9.9p1-3ubuntu2) ...
54s Installing new version of config file /etc/ssh/moduli ...
54s Replacing config file /etc/ssh/sshd_config with new version
55s Setting up plymouth (24.004.60-2ubuntu3) ...
55s update-initramfs: Generating /boot/initrd.img-6.11.0-8-generic
55s W: No lz4 in /usr/bin:/sbin:/bin, using gzip
62s update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
62s update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
63s Setting up python3 (3.12.7-1) ...
63s Setting up vim-tiny (2:9.1.0861-1ubuntu1) ...
63s Setting up sg3-utils-udev (1.48-0ubuntu1) ...
63s update-initramfs: deferring update (trigger activated)
63s Setting up plymouth-theme-ubuntu-text (24.004.60-2ubuntu3) ...
63s update-initramfs: deferring update (trigger activated)
63s Setting up gir1.2-girepository-2.0:amd64 (1.82.0-2) ...
63s Setting up python3-gi (3.50.0-3build1) ...
63s Setting up python3-rpds-py (0.21.0-2ubuntu1) ...
63s Setting up python3-jsonschema-specifications (2023.12.1-2) ...
63s Setting up python3-blinker (1.9.0-1) ...
63s Setting up python3-dbus (1.3.2-5build4) ...
63s Setting up python3-debconf (1.5.87ubuntu1) ...
64s Setting up python3-yaml (6.0.2-1build1) ...
64s Processing triggers for man-db (2.13.0-1) ...
65s Processing triggers for debianutils (5.21) ...
65s Processing triggers for install-info (7.1.1-1) ...
65s Processing triggers for initramfs-tools (0.142ubuntu35) ...
65s update-initramfs: Generating /boot/initrd.img-6.11.0-8-generic
65s W: No lz4 in /usr/bin:/sbin:/bin, using gzip
72s Processing triggers for libc-bin (2.40-1ubuntu3) ...
72s Processing triggers for ufw (0.36.2-8) ...
72s Reading package lists...
72s Building dependency tree...
72s Reading state information...
72s The following packages will be REMOVED:
72s libsgutils2-1.46-2*
72s 0 upgraded, 0 newly installed, 1 to remove and 0 not upgraded.
72s After this operation, 294 kB disk space will be freed.
72s (Reading database ... 75656 files and directories currently installed.)
72s Removing libsgutils2-1.46-2:amd64 (1.46-3ubuntu5) ...
72s Processing triggers for libc-bin (2.40-1ubuntu3) ...
73s Hit:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease
73s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease
73s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease
73s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease
74s Reading package lists...
74s Reading package lists...
74s Building dependency tree...
74s Reading state information...
74s Calculating upgrade...
74s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
74s Reading package lists...
74s Building dependency tree...
74s Reading state information...
75s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
75s autopkgtest [00:23:16]: rebooting testbed after setup commands that affected boot
91s autopkgtest [00:23:32]: testbed running kernel: Linux 6.11.0-8-generic #8-Ubuntu SMP PREEMPT_DYNAMIC Mon Sep 16 13:41:20 UTC 2024
93s autopkgtest [00:23:34]: @@@@@@@@@@@@@@@@@@@@ apt-source pandas
96s Get:1 http://ftpmaster.internal/ubuntu plucky/universe pandas 2.2.3+dfsg-5ubuntu1 (dsc) [5587 B]
96s Get:2 http://ftpmaster.internal/ubuntu plucky/universe pandas 2.2.3+dfsg-5ubuntu1 (tar) [11.2 MB]
96s Get:3 http://ftpmaster.internal/ubuntu plucky/universe pandas 2.2.3+dfsg-5ubuntu1 (diff) [111 kB]
96s gpgv: Signature made Fri Nov 15 11:32:46 2024 UTC
96s gpgv: using RSA key 25E3FF2D7F469DBE7D0D4E50AFCFEC8E669CE1C2
96s gpgv: Can't check signature: No public key
96s dpkg-source: warning: cannot verify inline signature for ./pandas_2.2.3+dfsg-5ubuntu1.dsc: no acceptable signature found
97s autopkgtest [00:23:38]: testing package pandas version 2.2.3+dfsg-5ubuntu1
98s autopkgtest [00:23:39]: build not needed
100s autopkgtest [00:23:41]: test command1: preparing testbed
102s Note, using file '/tmp/autopkgtest.rcV9Ni/1-autopkgtest-satdep.dsc' to get the build dependencies
102s Reading package lists...
102s Building dependency tree...
102s Reading state information...
102s Starting pkgProblemResolver with broken count: 0
102s Starting 2 pkgProblemResolver with broken count: 0
102s Done
103s The following NEW packages will be installed:
103s build-essential cpp cpp-14 cpp-14-x86-64-linux-gnu cpp-x86-64-linux-gnu g++
103s g++-14 g++-14-x86-64-linux-gnu g++-x86-64-linux-gnu gcc gcc-14
103s gcc-14-x86-64-linux-gnu gcc-x86-64-linux-gnu libasan8 libblas3 libcc1-0
103s libgcc-14-dev libgfortran5 libgomp1 libhwasan0 libisl23 libitm1 liblapack3
103s liblsan0 libmpc3 libquadmath0 libstdc++-14-dev libtsan2 libubsan1
103s python3-dateutil python3-numpy python3-pandas python3-pandas-lib python3-tz
103s 0 upgraded, 34 newly installed, 0 to remove and 0 not upgraded.
103s Need to get 86.7 MB of archives.
103s After this operation, 348 MB of additional disk space will be used.
103s Get:1 http://ftpmaster.internal/ubuntu plucky/main amd64 libisl23 amd64 0.27-1 [685 kB]
103s Get:2 http://ftpmaster.internal/ubuntu plucky/main amd64 libmpc3 amd64 1.3.1-1build2 [55.3 kB]
103s Get:3 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp-14-x86-64-linux-gnu amd64 14.2.0-8ubuntu1 [11.9 MB]
104s Get:4 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp-14 amd64 14.2.0-8ubuntu1 [1030 B]
104s Get:5 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp-x86-64-linux-gnu amd64 4:14.1.0-2ubuntu1 [5452 B]
104s Get:6 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp amd64 4:14.1.0-2ubuntu1 [22.4 kB]
104s Get:7 http://ftpmaster.internal/ubuntu plucky/main amd64 libcc1-0 amd64 14.2.0-8ubuntu1 [47.6 kB]
104s Get:8 http://ftpmaster.internal/ubuntu plucky/main amd64 libgomp1 amd64 14.2.0-8ubuntu1 [148 kB]
104s Get:9 http://ftpmaster.internal/ubuntu plucky/main amd64 libitm1 amd64 14.2.0-8ubuntu1 [29.1 kB]
104s Get:10 http://ftpmaster.internal/ubuntu plucky/main amd64 libasan8 amd64 14.2.0-8ubuntu1 [2998 kB]
104s Get:11 http://ftpmaster.internal/ubuntu plucky/main amd64 liblsan0 amd64 14.2.0-8ubuntu1 [1317 kB]
104s Get:12 http://ftpmaster.internal/ubuntu plucky/main amd64 libtsan2 amd64 14.2.0-8ubuntu1 [2732 kB]
104s Get:13 http://ftpmaster.internal/ubuntu plucky/main amd64 libubsan1 amd64 14.2.0-8ubuntu1 [1177 kB]
104s Get:14 http://ftpmaster.internal/ubuntu plucky/main amd64 libhwasan0 amd64 14.2.0-8ubuntu1 [1634 kB]
104s Get:15 http://ftpmaster.internal/ubuntu plucky/main amd64 libquadmath0 amd64 14.2.0-8ubuntu1 [153 kB]
104s Get:16 http://ftpmaster.internal/ubuntu plucky/main amd64 libgcc-14-dev amd64 14.2.0-8ubuntu1 [2814 kB]
104s Get:17 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc-14-x86-64-linux-gnu amd64 14.2.0-8ubuntu1 [23.3 MB]
105s Get:18 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc-14 amd64 14.2.0-8ubuntu1 [528 kB]
105s Get:19 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc-x86-64-linux-gnu amd64 4:14.1.0-2ubuntu1 [1214 B]
105s Get:20 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc amd64 4:14.1.0-2ubuntu1 [5000 B]
105s Get:21 http://ftpmaster.internal/ubuntu plucky/main amd64 libstdc++-14-dev amd64 14.2.0-8ubuntu1 [2504 kB]
106s Get:22 http://ftpmaster.internal/ubuntu plucky/main amd64 g++-14-x86-64-linux-gnu amd64 14.2.0-8ubuntu1 [13.3 MB]
106s Get:23 http://ftpmaster.internal/ubuntu plucky/main amd64 g++-14 amd64 14.2.0-8ubuntu1 [19.9 kB]
106s Get:24 http://ftpmaster.internal/ubuntu plucky/main amd64 g++-x86-64-linux-gnu amd64 4:14.1.0-2ubuntu1 [966 B]
106s Get:25 http://ftpmaster.internal/ubuntu plucky/main amd64 g++ amd64 4:14.1.0-2ubuntu1 [1100 B]
106s Get:26 http://ftpmaster.internal/ubuntu plucky/main amd64 build-essential amd64 12.10ubuntu1 [4928 B]
106s Get:27 http://ftpmaster.internal/ubuntu plucky/main amd64 libblas3 amd64 3.12.0-4 [332 kB]
106s Get:28 http://ftpmaster.internal/ubuntu plucky/main amd64 libgfortran5 amd64 14.2.0-8ubuntu1 [909 kB]
106s Get:29 http://ftpmaster.internal/ubuntu plucky/main amd64 liblapack3 amd64 3.12.0-4 [3177 kB]
106s Get:30 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-dateutil all 2.9.0-3 [80.2 kB]
106s Get:31 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-numpy amd64 1:1.26.4+ds-11ubuntu1 [5319 kB]
107s Get:32 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-tz all 2024.1-2 [31.4 kB]
107s Get:33 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pandas-lib amd64 2.2.3+dfsg-5ubuntu1 [8271 kB]
107s Get:34 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pandas all 2.2.3+dfsg-5ubuntu1 [3112 kB]
107s Fetched 86.7 MB in 4s (19.8 MB/s)
107s Selecting previously unselected package libisl23:amd64.
107s (Reading database ... 75651 files and directories currently installed.)
107s Preparing to unpack .../00-libisl23_0.27-1_amd64.deb ...
107s Unpacking libisl23:amd64 (0.27-1) ...
108s Selecting previously unselected package libmpc3:amd64.
108s Preparing to unpack .../01-libmpc3_1.3.1-1build2_amd64.deb ...
108s Unpacking libmpc3:amd64 (1.3.1-1build2) ...
108s Selecting previously unselected package cpp-14-x86-64-linux-gnu.
108s Preparing to unpack .../02-cpp-14-x86-64-linux-gnu_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking cpp-14-x86-64-linux-gnu (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package cpp-14.
108s Preparing to unpack .../03-cpp-14_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking cpp-14 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package cpp-x86-64-linux-gnu.
108s Preparing to unpack .../04-cpp-x86-64-linux-gnu_4%3a14.1.0-2ubuntu1_amd64.deb ...
108s Unpacking cpp-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
108s Selecting previously unselected package cpp.
108s Preparing to unpack .../05-cpp_4%3a14.1.0-2ubuntu1_amd64.deb ...
108s Unpacking cpp (4:14.1.0-2ubuntu1) ...
108s Selecting previously unselected package libcc1-0:amd64.
108s Preparing to unpack .../06-libcc1-0_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libcc1-0:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libgomp1:amd64.
108s Preparing to unpack .../07-libgomp1_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libgomp1:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libitm1:amd64.
108s Preparing to unpack .../08-libitm1_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libitm1:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libasan8:amd64.
108s Preparing to unpack .../09-libasan8_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libasan8:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package liblsan0:amd64.
108s Preparing to unpack .../10-liblsan0_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking liblsan0:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libtsan2:amd64.
108s Preparing to unpack .../11-libtsan2_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libtsan2:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libubsan1:amd64.
108s Preparing to unpack .../12-libubsan1_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libubsan1:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libhwasan0:amd64.
108s Preparing to unpack .../13-libhwasan0_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libhwasan0:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libquadmath0:amd64.
108s Preparing to unpack .../14-libquadmath0_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libquadmath0:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package libgcc-14-dev:amd64.
108s Preparing to unpack .../15-libgcc-14-dev_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking libgcc-14-dev:amd64 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package gcc-14-x86-64-linux-gnu.
108s Preparing to unpack .../16-gcc-14-x86-64-linux-gnu_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking gcc-14-x86-64-linux-gnu (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package gcc-14.
108s Preparing to unpack .../17-gcc-14_14.2.0-8ubuntu1_amd64.deb ...
108s Unpacking gcc-14 (14.2.0-8ubuntu1) ...
108s Selecting previously unselected package gcc-x86-64-linux-gnu.
108s Preparing to unpack .../18-gcc-x86-64-linux-gnu_4%3a14.1.0-2ubuntu1_amd64.deb ...
108s Unpacking gcc-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
109s Selecting previously unselected package gcc.
109s Preparing to unpack .../19-gcc_4%3a14.1.0-2ubuntu1_amd64.deb ...
109s Unpacking gcc (4:14.1.0-2ubuntu1) ...
109s Selecting previously unselected package libstdc++-14-dev:amd64.
109s Preparing to unpack .../20-libstdc++-14-dev_14.2.0-8ubuntu1_amd64.deb ...
109s Unpacking libstdc++-14-dev:amd64 (14.2.0-8ubuntu1) ...
109s Selecting previously unselected package g++-14-x86-64-linux-gnu.
109s Preparing to unpack .../21-g++-14-x86-64-linux-gnu_14.2.0-8ubuntu1_amd64.deb ...
109s Unpacking g++-14-x86-64-linux-gnu (14.2.0-8ubuntu1) ...
109s Selecting previously unselected package g++-14.
109s Preparing to unpack .../22-g++-14_14.2.0-8ubuntu1_amd64.deb ...
109s Unpacking g++-14 (14.2.0-8ubuntu1) ...
109s Selecting previously unselected package g++-x86-64-linux-gnu.
109s Preparing to unpack .../23-g++-x86-64-linux-gnu_4%3a14.1.0-2ubuntu1_amd64.deb ...
109s Unpacking g++-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
109s Selecting previously unselected package g++.
109s Preparing to unpack .../24-g++_4%3a14.1.0-2ubuntu1_amd64.deb ...
109s Unpacking g++ (4:14.1.0-2ubuntu1) ...
109s Selecting previously unselected package build-essential.
109s Preparing to unpack .../25-build-essential_12.10ubuntu1_amd64.deb ...
109s Unpacking build-essential (12.10ubuntu1) ...
109s Selecting previously unselected package libblas3:amd64.
109s Preparing to unpack .../26-libblas3_3.12.0-4_amd64.deb ...
109s Unpacking libblas3:amd64 (3.12.0-4) ...
109s Selecting previously unselected package libgfortran5:amd64.
109s Preparing to unpack .../27-libgfortran5_14.2.0-8ubuntu1_amd64.deb ...
109s Unpacking libgfortran5:amd64 (14.2.0-8ubuntu1) ...
109s Selecting previously unselected package liblapack3:amd64.
109s Preparing to unpack .../28-liblapack3_3.12.0-4_amd64.deb ...
109s Unpacking liblapack3:amd64 (3.12.0-4) ...
109s Selecting previously unselected package python3-dateutil.
109s Preparing to unpack .../29-python3-dateutil_2.9.0-3_all.deb ...
109s Unpacking python3-dateutil (2.9.0-3) ...
109s Selecting previously unselected package python3-numpy.
109s Preparing to unpack .../30-python3-numpy_1%3a1.26.4+ds-11ubuntu1_amd64.deb ...
109s Unpacking python3-numpy (1:1.26.4+ds-11ubuntu1) ...
109s Selecting previously unselected package python3-tz.
109s Preparing to unpack .../31-python3-tz_2024.1-2_all.deb ...
109s Unpacking python3-tz (2024.1-2) ...
109s Selecting previously unselected package python3-pandas-lib:amd64.
109s Preparing to unpack .../32-python3-pandas-lib_2.2.3+dfsg-5ubuntu1_amd64.deb ...
109s Unpacking python3-pandas-lib:amd64 (2.2.3+dfsg-5ubuntu1) ...
110s Selecting previously unselected package python3-pandas.
110s Preparing to unpack .../33-python3-pandas_2.2.3+dfsg-5ubuntu1_all.deb ...
110s Unpacking python3-pandas (2.2.3+dfsg-5ubuntu1) ...
110s Setting up libgomp1:amd64 (14.2.0-8ubuntu1) ...
110s Setting up python3-tz (2024.1-2) ...
110s Setting up libblas3:amd64 (3.12.0-4) ...
110s update-alternatives: using /usr/lib/x86_64-linux-gnu/blas/libblas.so.3 to provide /usr/lib/x86_64-linux-gnu/libblas.so.3 (libblas.so.3-x86_64-linux-gnu) in auto mode
110s Setting up libquadmath0:amd64 (14.2.0-8ubuntu1) ...
110s Setting up libmpc3:amd64 (1.3.1-1build2) ...
110s Setting up libgfortran5:amd64 (14.2.0-8ubuntu1) ...
110s Setting up libubsan1:amd64 (14.2.0-8ubuntu1) ...
110s Setting up libhwasan0:amd64 (14.2.0-8ubuntu1) ...
110s Setting up libasan8:amd64 (14.2.0-8ubuntu1) ...
110s Setting up python3-dateutil (2.9.0-3) ...
110s Setting up libtsan2:amd64 (14.2.0-8ubuntu1) ...
110s Setting up libisl23:amd64 (0.27-1) ...
110s Setting up libcc1-0:amd64 (14.2.0-8ubuntu1) ...
110s Setting up liblsan0:amd64 (14.2.0-8ubuntu1) ...
110s Setting up libitm1:amd64 (14.2.0-8ubuntu1) ...
110s Setting up liblapack3:amd64 (3.12.0-4) ...
110s update-alternatives: using /usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3 to provide /usr/lib/x86_64-linux-gnu/liblapack.so.3 (liblapack.so.3-x86_64-linux-gnu) in auto mode
110s Setting up cpp-14-x86-64-linux-gnu (14.2.0-8ubuntu1) ...
110s Setting up python3-numpy (1:1.26.4+ds-11ubuntu1) ...
111s Setting up cpp-14 (14.2.0-8ubuntu1) ...
111s Setting up libgcc-14-dev:amd64 (14.2.0-8ubuntu1) ...
111s Setting up libstdc++-14-dev:amd64 (14.2.0-8ubuntu1) ...
112s Setting up cpp-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
112s Setting up python3-pandas-lib:amd64 (2.2.3+dfsg-5ubuntu1) ...
112s Setting up python3-pandas (2.2.3+dfsg-5ubuntu1) ...
115s Setting up cpp (4:14.1.0-2ubuntu1) ...
115s Setting up gcc-14-x86-64-linux-gnu (14.2.0-8ubuntu1) ...
115s Setting up gcc-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
115s Setting up gcc-14 (14.2.0-8ubuntu1) ...
115s Setting up g++-14-x86-64-linux-gnu (14.2.0-8ubuntu1) ...
115s Setting up g++-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
115s Setting up g++-14 (14.2.0-8ubuntu1) ...
115s Setting up gcc (4:14.1.0-2ubuntu1) ...
115s Setting up g++ (4:14.1.0-2ubuntu1) ...
115s update-alternatives: using /usr/bin/g++ to provide /usr/bin/c++ (c++) in auto mode
115s Setting up build-essential (12.10ubuntu1) ...
115s Processing triggers for man-db (2.13.0-1) ...
116s Processing triggers for libc-bin (2.40-1ubuntu3) ...
117s Reading package lists...
117s Building dependency tree...
117s Reading state information...
117s Starting pkgProblemResolver with broken count: 0
117s Starting 2 pkgProblemResolver with broken count: 0
117s Done
118s The following NEW packages will be installed:
118s autopkgtest-satdep
118s 0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded.
118s Need to get 0 B/696 B of archives.
118s After this operation, 0 B of additional disk space will be used.
118s Get:1 /tmp/autopkgtest.rcV9Ni/2-autopkgtest-satdep.deb autopkgtest-satdep amd64 0 [696 B]
118s Selecting previously unselected package autopkgtest-satdep.
118s (Reading database ... 79672 files and directories currently installed.)
118s Preparing to unpack .../2-autopkgtest-satdep.deb ...
118s Unpacking autopkgtest-satdep (0) ...
118s Setting up autopkgtest-satdep (0) ...
119s autopkgtest: WARNING: package python3-pandas:i386 is not installed though it should be
121s (Reading database ... 79672 files and directories currently installed.)
121s Removing autopkgtest-satdep (0) ...
122s autopkgtest [00:24:03]: test command1: cd "$AUTOPKGTEST_TMP" && python3 -c "import pandas;a=pandas.DataFrame([[1,2],[3,4]])"
122s autopkgtest [00:24:03]: test command1: [-----------------------
123s autopkgtest [00:24:04]: test command1: -----------------------]
123s command1 PASS
123s autopkgtest [00:24:04]: test command1: - - - - - - - - - - results - - - - - - - - - -
123s autopkgtest [00:24:04]: test unittests3: preparing testbed
126s Note, using file '/tmp/autopkgtest.rcV9Ni/3-autopkgtest-satdep.dsc' to get the build dependencies
126s Reading package lists...
126s Building dependency tree...
126s Reading state information...
126s Starting pkgProblemResolver with broken count: 0
126s Starting 2 pkgProblemResolver with broken count: 0
126s Done
127s The following NEW packages will be installed:
127s blt fontconfig fontconfig-config fonts-dejavu-core fonts-dejavu-mono
127s fonts-lyx libaec0 libavahi-client3 libavahi-common-data libavahi-common3
127s libblosc1 libblosc2-4 libcups2t64 libdeflate0 libdouble-conversion3
127s libdrm-amdgpu1 libdrm-intel1 libdrm-radeon1 libegl-mesa0 libegl1
127s libfontconfig1 libfontenc1 libgbm1 libgl1 libgl1-mesa-dri libglapi-mesa
127s libglvnd0 libglx-mesa0 libglx0 libgraphite2-3 libharfbuzz0b libhdf5-103-1t64
127s libice6 libimagequant0 libinput-bin libinput10 libjbig0 libjpeg-turbo8
127s libjpeg8 libjs-jquery libjs-jquery-ui liblbfgsb0 liblcms2-2 liblerc4
127s libmd4c0 libmtdev1t64 libopenjp2-7 libpciaccess0 libpcre2-16-0 libpixman-1-0
127s libpython3.13-minimal libpython3.13-stdlib libqhull-r8.0 libqt5core5t64
127s libqt5dbus5t64 libqt5designer5 libqt5gui5t64 libqt5help5 libqt5network5t64
127s libqt5printsupport5t64 libqt5sql5t64 libqt5test5t64 libqt5widgets5t64
127s libqt5xml5t64 libraqm0 libsharpyuv0 libsm6 libsnappy1v5 libsz2 libtcl8.6
127s libtiff6 libtk8.6 libvulkan1 libwacom-common libwacom9 libwayland-client0
127s libwayland-server0 libwebp7 libwebpdemux2 libwebpmux3 libx11-xcb1 libxaw7
127s libxcb-dri2-0 libxcb-dri3-0 libxcb-glx0 libxcb-icccm4 libxcb-image0
127s libxcb-keysyms1 libxcb-present0 libxcb-randr0 libxcb-render-util0
127s libxcb-render0 libxcb-shape0 libxcb-shm0 libxcb-sync1 libxcb-util1
127s libxcb-xfixes0 libxcb-xinerama0 libxcb-xinput0 libxcb-xkb1 libxfixes3
127s libxfont2 libxft2 libxkbcommon-x11-0 libxkbfile1 libxmu6 libxpm4 libxrandr2
127s libxrender1 libxshmfence1 libxslt1.1 libxss1 libxt6t64 libxxf86vm1
127s locales-all mesa-libgallium python-matplotlib-data python-tables-data
127s python3-all python3-appdirs python3-async-generator python3-bottleneck
127s python3-brotli python3-bs4 python3-click python3-cloudpickle
127s python3-colorama python3-contourpy python3-cpuinfo python3-cycler
127s python3-dask python3-decorator python3-defusedxml python3-et-xmlfile
127s python3-execnet python3-fonttools python3-fs python3-fsspec python3-greenlet
127s python3-html5lib python3-hypothesis python3-iniconfig python3-kiwisolver
127s python3-locket python3-lxml python3-lz4 python3-matplotlib python3-mpmath
127s python3-numexpr python3-odf python3-openpyxl python3-packaging python3-partd
127s python3-pil python3-pil.imagetk python3-pluggy python3-py python3-pyqt5
127s python3-pyqt5.sip python3-pyreadstat python3-pytest python3-pytest-asyncio
127s python3-pytest-forked python3-pytest-localserver python3-pytest-xdist
127s python3-pytestqt python3-scipy python3-six python3-sortedcontainers
127s python3-soupsieve python3-sqlalchemy python3-sympy python3-tables
127s python3-tables-lib python3-tabulate python3-tk python3-toolz python3-ufolib2
127s python3-webencodings python3-werkzeug python3-xarray python3-xlrd
127s python3-xlsxwriter python3-zstandard python3.12-tk python3.13
127s python3.13-minimal python3.13-tk tk8.6-blt2.5 tzdata-legacy unicode-data
127s x11-common x11-xkb-utils xsel xserver-common xvfb
127s 0 upgraded, 196 newly installed, 0 to remove and 0 not upgraded.
127s Need to get 113 MB of archives.
127s After this operation, 704 MB of additional disk space will be used.
127s Get:1 http://ftpmaster.internal/ubuntu plucky/main amd64 libpython3.13-minimal amd64 3.13.0-2 [879 kB]
127s Get:2 http://ftpmaster.internal/ubuntu plucky/main amd64 python3.13-minimal amd64 3.13.0-2 [2188 kB]
127s Get:3 http://ftpmaster.internal/ubuntu plucky/main amd64 libtcl8.6 amd64 8.6.15+dfsg-2 [1085 kB]
127s Get:4 http://ftpmaster.internal/ubuntu plucky/main amd64 fonts-dejavu-mono all 2.37-8 [502 kB]
127s Get:5 http://ftpmaster.internal/ubuntu plucky/main amd64 fonts-dejavu-core all 2.37-8 [835 kB]
128s Get:6 http://ftpmaster.internal/ubuntu plucky/main amd64 fontconfig-config amd64 2.15.0-1.1ubuntu2 [37.3 kB]
128s Get:7 http://ftpmaster.internal/ubuntu plucky/main amd64 libfontconfig1 amd64 2.15.0-1.1ubuntu2 [139 kB]
128s Get:8 http://ftpmaster.internal/ubuntu plucky/main amd64 libxrender1 amd64 1:0.9.10-1.1build1 [19.0 kB]
128s Get:9 http://ftpmaster.internal/ubuntu plucky/main amd64 libxft2 amd64 2.3.6-1build1 [45.3 kB]
128s Get:10 http://ftpmaster.internal/ubuntu plucky/main amd64 x11-common all 1:7.7+23ubuntu3 [21.7 kB]
128s Get:11 http://ftpmaster.internal/ubuntu plucky/main amd64 libxss1 amd64 1:1.2.3-1build3 [7204 B]
128s Get:12 http://ftpmaster.internal/ubuntu plucky/main amd64 libtk8.6 amd64 8.6.15-1 [862 kB]
128s Get:13 http://ftpmaster.internal/ubuntu plucky/main amd64 tk8.6-blt2.5 amd64 2.5.3+dfsg-7build1 [630 kB]
128s Get:14 http://ftpmaster.internal/ubuntu plucky/main amd64 blt amd64 2.5.3+dfsg-7build1 [4840 B]
128s Get:15 http://ftpmaster.internal/ubuntu plucky/main amd64 fontconfig amd64 2.15.0-1.1ubuntu2 [180 kB]
128s Get:16 http://ftpmaster.internal/ubuntu plucky/universe amd64 fonts-lyx all 2.4.2.1-1 [171 kB]
128s Get:17 http://ftpmaster.internal/ubuntu plucky/universe amd64 libaec0 amd64 1.1.3-1 [22.7 kB]
128s Get:18 http://ftpmaster.internal/ubuntu plucky/main amd64 libavahi-common-data amd64 0.8-13ubuntu6 [29.7 kB]
128s Get:19 http://ftpmaster.internal/ubuntu plucky/main amd64 libavahi-common3 amd64 0.8-13ubuntu6 [23.3 kB]
128s Get:20 http://ftpmaster.internal/ubuntu plucky/main amd64 libavahi-client3 amd64 0.8-13ubuntu6 [26.8 kB]
128s Get:21 http://ftpmaster.internal/ubuntu plucky/main amd64 libsnappy1v5 amd64 1.2.1-1 [30.4 kB]
128s Get:22 http://ftpmaster.internal/ubuntu plucky/universe amd64 libblosc1 amd64 1.21.5+ds-1build1 [36.2 kB]
128s Get:23 http://ftpmaster.internal/ubuntu plucky/universe amd64 libblosc2-4 amd64 2.15.1+ds-1 [171 kB]
128s Get:24 http://ftpmaster.internal/ubuntu plucky/main amd64 libcups2t64 amd64 2.4.10-1ubuntu2 [271 kB]
128s Get:25 http://ftpmaster.internal/ubuntu plucky/main amd64 libdeflate0 amd64 1.22-1 [64.5 kB]
http://ftpmaster.internal/ubuntu plucky/main amd64 libdeflate0 amd64 1.22-1 [64.5 kB] 128s Get:26 http://ftpmaster.internal/ubuntu plucky/universe amd64 libdouble-conversion3 amd64 3.3.0-1build1 [40.3 kB] 128s Get:27 http://ftpmaster.internal/ubuntu plucky/main amd64 libdrm-amdgpu1 amd64 2.4.123-1 [21.7 kB] 128s Get:28 http://ftpmaster.internal/ubuntu plucky/main amd64 libpciaccess0 amd64 0.17-3build1 [18.6 kB] 128s Get:29 http://ftpmaster.internal/ubuntu plucky/main amd64 libdrm-intel1 amd64 2.4.123-1 [68.8 kB] 128s Get:30 http://ftpmaster.internal/ubuntu plucky/main amd64 libdrm-radeon1 amd64 2.4.123-1 [25.3 kB] 128s Get:31 http://ftpmaster.internal/ubuntu plucky/main amd64 libwayland-server0 amd64 1.23.0-1 [35.1 kB] 128s Get:32 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-randr0 amd64 1.17.0-2 [17.9 kB] 128s Get:33 http://ftpmaster.internal/ubuntu plucky/main amd64 libglapi-mesa amd64 24.2.3-1ubuntu1 [42.4 kB] 128s Get:34 http://ftpmaster.internal/ubuntu plucky/main amd64 libx11-xcb1 amd64 2:1.8.10-2 [7944 B] 128s Get:35 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-dri2-0 amd64 1.17.0-2 [7222 B] 128s Get:36 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-dri3-0 amd64 1.17.0-2 [7508 B] 128s Get:37 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-present0 amd64 1.17.0-2 [6064 B] 128s Get:38 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-sync1 amd64 1.17.0-2 [9312 B] 128s Get:39 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-xfixes0 amd64 1.17.0-2 [10.2 kB] 128s Get:40 http://ftpmaster.internal/ubuntu plucky/main amd64 libxshmfence1 amd64 1.3-1build5 [4764 B] 128s Get:41 http://ftpmaster.internal/ubuntu plucky/main amd64 mesa-libgallium amd64 24.2.3-1ubuntu1 [9904 kB] 128s Get:42 http://ftpmaster.internal/ubuntu plucky/main amd64 libgbm1 amd64 24.2.3-1ubuntu1 [32.0 kB] 128s Get:43 http://ftpmaster.internal/ubuntu plucky/main amd64 libwayland-client0 amd64 1.23.0-1 [27.1 kB] 128s Get:44 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-shm0 amd64 1.17.0-2 [5758 B] 128s Get:45 http://ftpmaster.internal/ubuntu plucky/main amd64 libegl-mesa0 amd64 24.2.3-1ubuntu1 [129 kB] 128s Get:46 http://ftpmaster.internal/ubuntu plucky/main amd64 libfontenc1 amd64 1:1.1.8-1build1 [14.0 kB] 128s Get:47 http://ftpmaster.internal/ubuntu plucky/main amd64 libvulkan1 amd64 1.3.296.0-1 [143 kB] 128s Get:48 http://ftpmaster.internal/ubuntu plucky/main amd64 libgl1-mesa-dri amd64 24.2.3-1ubuntu1 [34.4 kB] 128s Get:49 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-glx0 amd64 1.17.0-2 [24.8 kB] 128s Get:50 http://ftpmaster.internal/ubuntu plucky/main amd64 libxfixes3 amd64 1:6.0.0-2build1 [10.8 kB] 128s Get:51 http://ftpmaster.internal/ubuntu plucky/main amd64 libxxf86vm1 amd64 1:1.1.4-1build4 [9282 B] 128s Get:52 http://ftpmaster.internal/ubuntu plucky/main amd64 libglx-mesa0 amd64 24.2.3-1ubuntu1 [153 kB] 129s Get:53 http://ftpmaster.internal/ubuntu plucky/main amd64 libgraphite2-3 amd64 1.3.14-2ubuntu1 [73.1 kB] 129s Get:54 http://ftpmaster.internal/ubuntu plucky/main amd64 libharfbuzz0b amd64 10.0.1-1 [540 kB] 129s Get:55 http://ftpmaster.internal/ubuntu plucky/universe amd64 libsz2 amd64 1.1.3-1 [5456 B] 129s Get:56 http://ftpmaster.internal/ubuntu plucky/universe amd64 libhdf5-103-1t64 amd64 1.10.10+repack-4ubuntu3 [1280 kB] 129s Get:57 http://ftpmaster.internal/ubuntu plucky/main amd64 libice6 amd64 2:1.1.1-1 [44.1 kB] 129s Get:58 http://ftpmaster.internal/ubuntu plucky/main amd64 libimagequant0 amd64 2.18.0-1build1 [36.3 
kB] 129s Get:59 http://ftpmaster.internal/ubuntu plucky/main amd64 libwacom-common all 2.13.0-1 [98.6 kB] 129s Get:60 http://ftpmaster.internal/ubuntu plucky/main amd64 libwacom9 amd64 2.13.0-1 [25.1 kB] 129s Get:61 http://ftpmaster.internal/ubuntu plucky/main amd64 libinput-bin amd64 1.26.2-1 [22.8 kB] 129s Get:62 http://ftpmaster.internal/ubuntu plucky/main amd64 libmtdev1t64 amd64 1.1.6-1.2 [14.4 kB] 129s Get:63 http://ftpmaster.internal/ubuntu plucky/main amd64 libinput10 amd64 1.26.2-1 [137 kB] 129s Get:64 http://ftpmaster.internal/ubuntu plucky/main amd64 libjpeg-turbo8 amd64 2.1.5-3ubuntu2 [179 kB] 129s Get:65 http://ftpmaster.internal/ubuntu plucky/main amd64 libjpeg8 amd64 8c-2ubuntu11 [2148 B] 129s Get:66 http://ftpmaster.internal/ubuntu plucky/main amd64 libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 129s Get:67 http://ftpmaster.internal/ubuntu plucky/universe amd64 libjs-jquery-ui all 1.13.2+dfsg-1 [252 kB] 129s Get:68 http://ftpmaster.internal/ubuntu plucky/universe amd64 liblbfgsb0 amd64 3.0+dfsg.4-1build1 [29.9 kB] 129s Get:69 http://ftpmaster.internal/ubuntu plucky/main amd64 liblcms2-2 amd64 2.16-2 [212 kB] 129s Get:70 http://ftpmaster.internal/ubuntu plucky/main amd64 liblerc4 amd64 4.0.0+ds-5ubuntu1 [271 kB] 129s Get:71 http://ftpmaster.internal/ubuntu plucky/universe amd64 libmd4c0 amd64 0.5.2-2 [50.1 kB] 129s Get:72 http://ftpmaster.internal/ubuntu plucky/main amd64 libpcre2-16-0 amd64 10.42-4ubuntu3 [214 kB] 129s Get:73 http://ftpmaster.internal/ubuntu plucky/main amd64 libpixman-1-0 amd64 0.44.0-3 [427 kB] 129s Get:74 http://ftpmaster.internal/ubuntu plucky/main amd64 libpython3.13-stdlib amd64 3.13.0-2 [2107 kB] 129s Get:75 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqhull-r8.0 amd64 2020.2-6build1 [193 kB] 129s Get:76 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5core5t64 amd64 5.15.15+dfsg-1ubuntu1 [2036 kB] 129s Get:77 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5dbus5t64 amd64 5.15.15+dfsg-1ubuntu1 [221 kB] 129s Get:78 http://ftpmaster.internal/ubuntu plucky/main amd64 libglvnd0 amd64 1.7.0-1build1 [69.6 kB] 129s Get:79 http://ftpmaster.internal/ubuntu plucky/main amd64 libegl1 amd64 1.7.0-1build1 [28.7 kB] 129s Get:80 http://ftpmaster.internal/ubuntu plucky/main amd64 libglx0 amd64 1.7.0-1build1 [38.6 kB] 129s Get:81 http://ftpmaster.internal/ubuntu plucky/main amd64 libgl1 amd64 1.7.0-1build1 [102 kB] 129s Get:82 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5network5t64 amd64 5.15.15+dfsg-1ubuntu1 [724 kB] 129s Get:83 http://ftpmaster.internal/ubuntu plucky/main amd64 libsm6 amd64 2:1.2.4-1 [17.4 kB] 129s Get:84 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-icccm4 amd64 0.4.2-1 [11.1 kB] 129s Get:85 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-util1 amd64 0.4.0-1build3 [10.7 kB] 129s Get:86 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-image0 amd64 0.4.0-2build1 [10.8 kB] 129s Get:87 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-keysyms1 amd64 0.4.0-1build4 [7956 B] 129s Get:88 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-render0 amd64 1.17.0-2 [16.2 kB] 129s Get:89 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-render-util0 amd64 0.3.9-1build4 [9608 B] 129s Get:90 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-shape0 amd64 1.17.0-2 [6092 B] 129s Get:91 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-xinerama0 amd64 1.17.0-2 [5412 B] 129s Get:92 http://ftpmaster.internal/ubuntu plucky/main amd64 
libxcb-xinput0 amd64 1.17.0-2 [33.2 kB] 129s Get:93 http://ftpmaster.internal/ubuntu plucky/main amd64 libxcb-xkb1 amd64 1.17.0-2 [32.3 kB] 129s Get:94 http://ftpmaster.internal/ubuntu plucky/main amd64 libxkbcommon-x11-0 amd64 1.7.0-1 [15.2 kB] 129s Get:95 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5gui5t64 amd64 5.15.15+dfsg-1ubuntu1 [3791 kB] 129s Get:96 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5widgets5t64 amd64 5.15.15+dfsg-1ubuntu1 [2561 kB] 129s Get:97 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5xml5t64 amd64 5.15.15+dfsg-1ubuntu1 [124 kB] 129s Get:98 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5designer5 amd64 5.15.15-2 [2827 kB] 130s Get:99 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5sql5t64 amd64 5.15.15+dfsg-1ubuntu1 [122 kB] 130s Get:100 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5help5 amd64 5.15.15-2 [161 kB] 130s Get:101 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5printsupport5t64 amd64 5.15.15+dfsg-1ubuntu1 [208 kB] 130s Get:102 http://ftpmaster.internal/ubuntu plucky/universe amd64 libqt5test5t64 amd64 5.15.15+dfsg-1ubuntu1 [149 kB] 130s Get:103 http://ftpmaster.internal/ubuntu plucky/main amd64 libraqm0 amd64 0.10.1-1build1 [15.0 kB] 130s Get:104 http://ftpmaster.internal/ubuntu plucky/main amd64 libsharpyuv0 amd64 1.4.0-0.1 [17.5 kB] 130s Get:105 http://ftpmaster.internal/ubuntu plucky/main amd64 libjbig0 amd64 2.1-6.1ubuntu2 [29.7 kB] 130s Get:106 http://ftpmaster.internal/ubuntu plucky/main amd64 libwebp7 amd64 1.4.0-0.1 [231 kB] 130s Get:107 http://ftpmaster.internal/ubuntu plucky/main amd64 libtiff6 amd64 4.5.1+git230720-4ubuntu4 [200 kB] 130s Get:108 http://ftpmaster.internal/ubuntu plucky/main amd64 libwebpdemux2 amd64 1.4.0-0.1 [12.4 kB] 130s Get:109 http://ftpmaster.internal/ubuntu plucky/main amd64 libwebpmux3 amd64 1.4.0-0.1 [25.8 kB] 130s Get:110 http://ftpmaster.internal/ubuntu plucky/main amd64 libxt6t64 amd64 1:1.2.1-1.2build1 [171 kB] 130s Get:111 http://ftpmaster.internal/ubuntu plucky/main amd64 libxmu6 amd64 2:1.1.3-3build2 [47.6 kB] 130s Get:112 http://ftpmaster.internal/ubuntu plucky/main amd64 libxpm4 amd64 1:3.5.17-1build2 [36.5 kB] 130s Get:113 http://ftpmaster.internal/ubuntu plucky/main amd64 libxaw7 amd64 2:1.0.16-1 [207 kB] 130s Get:114 http://ftpmaster.internal/ubuntu plucky/main amd64 libxfont2 amd64 1:2.0.6-1build1 [93.0 kB] 130s Get:115 http://ftpmaster.internal/ubuntu plucky/main amd64 libxkbfile1 amd64 1:1.1.0-1build4 [70.0 kB] 130s Get:116 http://ftpmaster.internal/ubuntu plucky/main amd64 libxrandr2 amd64 2:1.5.4-1 [19.6 kB] 130s Get:117 http://ftpmaster.internal/ubuntu plucky/main amd64 libxslt1.1 amd64 1.1.39-0exp1ubuntu1 [169 kB] 130s Get:118 http://ftpmaster.internal/ubuntu plucky/universe amd64 python-matplotlib-data all 3.8.3-3ubuntu1 [2928 kB] 130s Get:119 http://ftpmaster.internal/ubuntu plucky/universe amd64 python-tables-data all 3.10.1-1build1 [49.4 kB] 130s Get:120 http://ftpmaster.internal/ubuntu plucky/main amd64 python3.13 amd64 3.13.0-2 [719 kB] 130s Get:121 http://ftpmaster.internal/ubuntu plucky-proposed/main amd64 python3-all amd64 3.12.7-1 [890 B] 130s Get:122 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-appdirs all 1.4.4-4 [10.9 kB] 130s Get:123 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-async-generator all 1.10-4 [17.5 kB] 130s Get:124 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-bottleneck amd64 1.3.8+ds1-1build1 [89.5 kB] 130s Get:125 
http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-brotli amd64 1.1.0-2build3 [368 kB] 130s Get:126 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-soupsieve all 2.6-1 [33.0 kB] 130s Get:127 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-bs4 all 4.12.3-3 [109 kB] 130s Get:128 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-colorama all 0.4.6-4 [32.1 kB] 130s Get:129 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-click all 8.1.7-2 [79.5 kB] 130s Get:130 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-cloudpickle all 3.0.0-2 [21.5 kB] 130s Get:131 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-contourpy amd64 1.3.0-2build1 [271 kB] 130s Get:132 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-cpuinfo all 9.0.0+git20221119-2 [21.6 kB] 130s Get:133 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-cycler all 0.12.1-1 [9716 B] 130s Get:134 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-fsspec all 2024.9.0-1 [207 kB] 130s Get:135 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-toolz all 1.0.0-1 [44.9 kB] 130s Get:136 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-packaging all 24.2-1 [51.5 kB] 130s Get:137 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-locket all 1.0.0-2 [5872 B] 131s Get:138 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-partd all 1.4.2-1 [15.7 kB] 131s Get:139 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-dask all 2024.5.2+dfsg-1 [849 kB] 131s Get:140 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-decorator all 5.1.1-5 [10.1 kB] 131s Get:141 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-defusedxml all 0.7.1-2 [42.0 kB] 131s Get:142 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-et-xmlfile all 2.0.0-1 [79.4 kB] 131s Get:143 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-execnet all 2.1.1-1 [33.4 kB] 131s Get:144 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-six all 1.16.0-7 [13.1 kB] 131s Get:145 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-fs all 2.4.16-4 [91.3 kB] 131s Get:146 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-lxml amd64 5.3.0-1build1 [1834 kB] 131s Get:147 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-lz4 amd64 4.0.2+dfsg-1build5 [27.5 kB] 131s Get:148 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-scipy amd64 1.13.1-5ubuntu1 [21.7 MB] 132s Get:149 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-mpmath all 1.3.0-1 [425 kB] 132s Get:150 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-sympy all 1.13.3-1 [4228 kB] 132s Get:151 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-ufolib2 all 0.16.1+dfsg1-1 [33.4 kB] 132s Get:152 http://ftpmaster.internal/ubuntu plucky/universe amd64 unicode-data all 15.1.0-1 [8878 kB] 132s Get:153 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-fonttools amd64 4.55.0-3 [1759 kB] 132s Get:154 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-webencodings all 0.5.1-5 [11.5 kB] 132s Get:155 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-html5lib all 1.2-1 [90.8 kB] 132s Get:156 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-sortedcontainers all 2.4.0-2 [27.6 kB] 132s Get:157 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-hypothesis all 6.119.3-1 [329 kB] 132s Get:158 http://ftpmaster.internal/ubuntu 
plucky/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B] 132s Get:159 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-kiwisolver amd64 1.4.7-2build1 [70.3 kB] 132s Get:160 http://ftpmaster.internal/ubuntu plucky/main amd64 libopenjp2-7 amd64 2.5.0-2ubuntu1 [184 kB] 132s Get:161 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-pil amd64 10.4.0-1ubuntu2 [580 kB] 132s Get:162 http://ftpmaster.internal/ubuntu plucky/main amd64 python3.12-tk amd64 3.12.7-3 [116 kB] 132s Get:163 http://ftpmaster.internal/ubuntu plucky/main amd64 python3.13-tk amd64 3.13.0-2 [106 kB] 132s Get:164 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-tk amd64 3.12.7-1 [9752 B] 132s Get:165 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pil.imagetk amd64 10.4.0-1ubuntu2 [9568 B] 132s Get:166 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-matplotlib amd64 3.8.3-3ubuntu1 [4691 kB] 132s Get:167 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-numexpr amd64 2.10.1-2build1 [132 kB] 132s Get:168 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-odf all 1.4.2-3 [78.9 kB] 132s Get:169 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-openpyxl all 3.1.5+dfsg-1 [152 kB] 132s Get:170 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pluggy all 1.5.0-1 [21.0 kB] 132s Get:171 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-py all 1.11.0-2 [72.7 kB] 132s Get:172 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pyqt5.sip amd64 12.15.0-1build1 [78.3 kB] 132s Get:173 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pyqt5 amd64 5.15.11+dfsg-1build1 [2726 kB] 132s Get:174 http://ftpmaster.internal/ubuntu plucky-proposed/universe amd64 python3-pyreadstat amd64 1.2.8-1 [638 kB] 132s Get:175 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest all 8.3.3-1 [251 kB] 132s Get:176 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest-asyncio all 0.20.3-1.3 [10.7 kB] 132s Get:177 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest-forked all 1.6.0-2 [7382 B] 132s Get:178 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-werkzeug all 3.0.4-1ubuntu1 [171 kB] 132s Get:179 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest-localserver all 0.8.1-2 [22.9 kB] 132s Get:180 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest-xdist all 3.6.1-1 [33.8 kB] 132s Get:181 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytestqt all 4.3.1-1 [37.9 kB] 132s Get:182 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-greenlet amd64 3.1.0-1 [183 kB] 132s Get:183 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-sqlalchemy all 2.0.32+ds1-1ubuntu3 [1206 kB] 133s Get:184 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-tables-lib amd64 3.10.1-1build1 [864 kB] 133s Get:185 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-tables all 3.10.1-1build1 [354 kB] 133s Get:186 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-tabulate all 0.9.0-1 [45.3 kB] 133s Get:187 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-xarray all 2024.09.0-1 [780 kB] 133s Get:188 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-xlrd all 2.0.1-2 [83.1 kB] 133s Get:189 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-xlsxwriter all 3.1.9-1 [512 kB] 133s Get:190 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-zstandard 
amd64 0.23.0-2build1 [419 kB] 133s Get:191 http://ftpmaster.internal/ubuntu plucky/main amd64 tzdata-legacy all 2024b-1ubuntu2 [99.9 kB] 133s Get:192 http://ftpmaster.internal/ubuntu plucky/main amd64 x11-xkb-utils amd64 7.7+9 [169 kB] 133s Get:193 http://ftpmaster.internal/ubuntu plucky/universe amd64 xsel amd64 1.2.1-1 [20.5 kB] 133s Get:194 http://ftpmaster.internal/ubuntu plucky/main amd64 xserver-common all 2:21.1.14-2ubuntu1 [33.7 kB] 133s Get:195 http://ftpmaster.internal/ubuntu plucky/universe amd64 xvfb amd64 2:21.1.14-2ubuntu1 [965 kB] 133s Get:196 http://ftpmaster.internal/ubuntu plucky/universe amd64 locales-all amd64 2.40-1ubuntu3 [11.0 MB] 134s Fetched 113 MB in 6s (17.7 MB/s) 134s Selecting previously unselected package libpython3.13-minimal:amd64. 134s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 79672 files and directories currently installed.) 134s Preparing to unpack .../000-libpython3.13-minimal_3.13.0-2_amd64.deb ... 134s Unpacking libpython3.13-minimal:amd64 (3.13.0-2) ... 134s Selecting previously unselected package python3.13-minimal. 134s Preparing to unpack .../001-python3.13-minimal_3.13.0-2_amd64.deb ... 134s Unpacking python3.13-minimal (3.13.0-2) ... 134s Selecting previously unselected package libtcl8.6:amd64. 134s Preparing to unpack .../002-libtcl8.6_8.6.15+dfsg-2_amd64.deb ... 134s Unpacking libtcl8.6:amd64 (8.6.15+dfsg-2) ... 134s Selecting previously unselected package fonts-dejavu-mono. 134s Preparing to unpack .../003-fonts-dejavu-mono_2.37-8_all.deb ... 134s Unpacking fonts-dejavu-mono (2.37-8) ... 134s Selecting previously unselected package fonts-dejavu-core. 134s Preparing to unpack .../004-fonts-dejavu-core_2.37-8_all.deb ... 134s Unpacking fonts-dejavu-core (2.37-8) ... 134s Selecting previously unselected package fontconfig-config. 134s Preparing to unpack .../005-fontconfig-config_2.15.0-1.1ubuntu2_amd64.deb ... 134s Unpacking fontconfig-config (2.15.0-1.1ubuntu2) ... 134s Selecting previously unselected package libfontconfig1:amd64. 134s Preparing to unpack .../006-libfontconfig1_2.15.0-1.1ubuntu2_amd64.deb ... 134s Unpacking libfontconfig1:amd64 (2.15.0-1.1ubuntu2) ... 134s Selecting previously unselected package libxrender1:amd64. 134s Preparing to unpack .../007-libxrender1_1%3a0.9.10-1.1build1_amd64.deb ... 134s Unpacking libxrender1:amd64 (1:0.9.10-1.1build1) ... 134s Selecting previously unselected package libxft2:amd64. 134s Preparing to unpack .../008-libxft2_2.3.6-1build1_amd64.deb ... 134s Unpacking libxft2:amd64 (2.3.6-1build1) ... 134s Selecting previously unselected package x11-common. 134s Preparing to unpack .../009-x11-common_1%3a7.7+23ubuntu3_all.deb ... 134s Unpacking x11-common (1:7.7+23ubuntu3) ... 134s Selecting previously unselected package libxss1:amd64. 134s Preparing to unpack .../010-libxss1_1%3a1.2.3-1build3_amd64.deb ... 134s Unpacking libxss1:amd64 (1:1.2.3-1build3) ... 134s Selecting previously unselected package libtk8.6:amd64. 134s Preparing to unpack .../011-libtk8.6_8.6.15-1_amd64.deb ... 
134s Unpacking libtk8.6:amd64 (8.6.15-1) ... 134s Selecting previously unselected package tk8.6-blt2.5. 134s Preparing to unpack .../012-tk8.6-blt2.5_2.5.3+dfsg-7build1_amd64.deb ... 134s Unpacking tk8.6-blt2.5 (2.5.3+dfsg-7build1) ... 134s Selecting previously unselected package blt. 134s Preparing to unpack .../013-blt_2.5.3+dfsg-7build1_amd64.deb ... 134s Unpacking blt (2.5.3+dfsg-7build1) ... 134s Selecting previously unselected package fontconfig. 134s Preparing to unpack .../014-fontconfig_2.15.0-1.1ubuntu2_amd64.deb ... 134s Unpacking fontconfig (2.15.0-1.1ubuntu2) ... 134s Selecting previously unselected package fonts-lyx. 134s Preparing to unpack .../015-fonts-lyx_2.4.2.1-1_all.deb ... 134s Unpacking fonts-lyx (2.4.2.1-1) ... 134s Selecting previously unselected package libaec0:amd64. 134s Preparing to unpack .../016-libaec0_1.1.3-1_amd64.deb ... 134s Unpacking libaec0:amd64 (1.1.3-1) ... 134s Selecting previously unselected package libavahi-common-data:amd64. 134s Preparing to unpack .../017-libavahi-common-data_0.8-13ubuntu6_amd64.deb ... 134s Unpacking libavahi-common-data:amd64 (0.8-13ubuntu6) ... 134s Selecting previously unselected package libavahi-common3:amd64. 134s Preparing to unpack .../018-libavahi-common3_0.8-13ubuntu6_amd64.deb ... 134s Unpacking libavahi-common3:amd64 (0.8-13ubuntu6) ... 134s Selecting previously unselected package libavahi-client3:amd64. 134s Preparing to unpack .../019-libavahi-client3_0.8-13ubuntu6_amd64.deb ... 134s Unpacking libavahi-client3:amd64 (0.8-13ubuntu6) ... 134s Selecting previously unselected package libsnappy1v5:amd64. 134s Preparing to unpack .../020-libsnappy1v5_1.2.1-1_amd64.deb ... 134s Unpacking libsnappy1v5:amd64 (1.2.1-1) ... 134s Selecting previously unselected package libblosc1:amd64. 134s Preparing to unpack .../021-libblosc1_1.21.5+ds-1build1_amd64.deb ... 134s Unpacking libblosc1:amd64 (1.21.5+ds-1build1) ... 134s Selecting previously unselected package libblosc2-4:amd64. 134s Preparing to unpack .../022-libblosc2-4_2.15.1+ds-1_amd64.deb ... 134s Unpacking libblosc2-4:amd64 (2.15.1+ds-1) ... 135s Selecting previously unselected package libcups2t64:amd64. 135s Preparing to unpack .../023-libcups2t64_2.4.10-1ubuntu2_amd64.deb ... 135s Unpacking libcups2t64:amd64 (2.4.10-1ubuntu2) ... 135s Selecting previously unselected package libdeflate0:amd64. 135s Preparing to unpack .../024-libdeflate0_1.22-1_amd64.deb ... 135s Unpacking libdeflate0:amd64 (1.22-1) ... 135s Selecting previously unselected package libdouble-conversion3:amd64. 135s Preparing to unpack .../025-libdouble-conversion3_3.3.0-1build1_amd64.deb ... 135s Unpacking libdouble-conversion3:amd64 (3.3.0-1build1) ... 135s Selecting previously unselected package libdrm-amdgpu1:amd64. 135s Preparing to unpack .../026-libdrm-amdgpu1_2.4.123-1_amd64.deb ... 135s Unpacking libdrm-amdgpu1:amd64 (2.4.123-1) ... 135s Selecting previously unselected package libpciaccess0:amd64. 135s Preparing to unpack .../027-libpciaccess0_0.17-3build1_amd64.deb ... 135s Unpacking libpciaccess0:amd64 (0.17-3build1) ... 135s Selecting previously unselected package libdrm-intel1:amd64. 135s Preparing to unpack .../028-libdrm-intel1_2.4.123-1_amd64.deb ... 135s Unpacking libdrm-intel1:amd64 (2.4.123-1) ... 135s Selecting previously unselected package libdrm-radeon1:amd64. 135s Preparing to unpack .../029-libdrm-radeon1_2.4.123-1_amd64.deb ... 135s Unpacking libdrm-radeon1:amd64 (2.4.123-1) ... 135s Selecting previously unselected package libwayland-server0:amd64. 
135s Preparing to unpack .../030-libwayland-server0_1.23.0-1_amd64.deb ... 135s Unpacking libwayland-server0:amd64 (1.23.0-1) ... 135s Selecting previously unselected package libxcb-randr0:amd64. 135s Preparing to unpack .../031-libxcb-randr0_1.17.0-2_amd64.deb ... 135s Unpacking libxcb-randr0:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libglapi-mesa:amd64. 135s Preparing to unpack .../032-libglapi-mesa_24.2.3-1ubuntu1_amd64.deb ... 135s Unpacking libglapi-mesa:amd64 (24.2.3-1ubuntu1) ... 135s Selecting previously unselected package libx11-xcb1:amd64. 135s Preparing to unpack .../033-libx11-xcb1_2%3a1.8.10-2_amd64.deb ... 135s Unpacking libx11-xcb1:amd64 (2:1.8.10-2) ... 135s Selecting previously unselected package libxcb-dri2-0:amd64. 135s Preparing to unpack .../034-libxcb-dri2-0_1.17.0-2_amd64.deb ... 135s Unpacking libxcb-dri2-0:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libxcb-dri3-0:amd64. 135s Preparing to unpack .../035-libxcb-dri3-0_1.17.0-2_amd64.deb ... 135s Unpacking libxcb-dri3-0:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libxcb-present0:amd64. 135s Preparing to unpack .../036-libxcb-present0_1.17.0-2_amd64.deb ... 135s Unpacking libxcb-present0:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libxcb-sync1:amd64. 135s Preparing to unpack .../037-libxcb-sync1_1.17.0-2_amd64.deb ... 135s Unpacking libxcb-sync1:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libxcb-xfixes0:amd64. 135s Preparing to unpack .../038-libxcb-xfixes0_1.17.0-2_amd64.deb ... 135s Unpacking libxcb-xfixes0:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libxshmfence1:amd64. 135s Preparing to unpack .../039-libxshmfence1_1.3-1build5_amd64.deb ... 135s Unpacking libxshmfence1:amd64 (1.3-1build5) ... 135s Selecting previously unselected package mesa-libgallium:amd64. 135s Preparing to unpack .../040-mesa-libgallium_24.2.3-1ubuntu1_amd64.deb ... 135s Unpacking mesa-libgallium:amd64 (24.2.3-1ubuntu1) ... 135s Selecting previously unselected package libgbm1:amd64. 135s Preparing to unpack .../041-libgbm1_24.2.3-1ubuntu1_amd64.deb ... 135s Unpacking libgbm1:amd64 (24.2.3-1ubuntu1) ... 135s Selecting previously unselected package libwayland-client0:amd64. 135s Preparing to unpack .../042-libwayland-client0_1.23.0-1_amd64.deb ... 135s Unpacking libwayland-client0:amd64 (1.23.0-1) ... 135s Selecting previously unselected package libxcb-shm0:amd64. 135s Preparing to unpack .../043-libxcb-shm0_1.17.0-2_amd64.deb ... 135s Unpacking libxcb-shm0:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libegl-mesa0:amd64. 135s Preparing to unpack .../044-libegl-mesa0_24.2.3-1ubuntu1_amd64.deb ... 135s Unpacking libegl-mesa0:amd64 (24.2.3-1ubuntu1) ... 135s Selecting previously unselected package libfontenc1:amd64. 135s Preparing to unpack .../045-libfontenc1_1%3a1.1.8-1build1_amd64.deb ... 135s Unpacking libfontenc1:amd64 (1:1.1.8-1build1) ... 135s Selecting previously unselected package libvulkan1:amd64. 135s Preparing to unpack .../046-libvulkan1_1.3.296.0-1_amd64.deb ... 135s Unpacking libvulkan1:amd64 (1.3.296.0-1) ... 135s Selecting previously unselected package libgl1-mesa-dri:amd64. 135s Preparing to unpack .../047-libgl1-mesa-dri_24.2.3-1ubuntu1_amd64.deb ... 135s Unpacking libgl1-mesa-dri:amd64 (24.2.3-1ubuntu1) ... 135s Selecting previously unselected package libxcb-glx0:amd64. 135s Preparing to unpack .../048-libxcb-glx0_1.17.0-2_amd64.deb ... 
135s Unpacking libxcb-glx0:amd64 (1.17.0-2) ... 135s Selecting previously unselected package libxfixes3:amd64. 135s Preparing to unpack .../049-libxfixes3_1%3a6.0.0-2build1_amd64.deb ... 135s Unpacking libxfixes3:amd64 (1:6.0.0-2build1) ... 135s Selecting previously unselected package libxxf86vm1:amd64. 135s Preparing to unpack .../050-libxxf86vm1_1%3a1.1.4-1build4_amd64.deb ... 135s Unpacking libxxf86vm1:amd64 (1:1.1.4-1build4) ... 135s Selecting previously unselected package libglx-mesa0:amd64. 135s Preparing to unpack .../051-libglx-mesa0_24.2.3-1ubuntu1_amd64.deb ... 135s Unpacking libglx-mesa0:amd64 (24.2.3-1ubuntu1) ... 135s Selecting previously unselected package libgraphite2-3:amd64. 135s Preparing to unpack .../052-libgraphite2-3_1.3.14-2ubuntu1_amd64.deb ... 135s Unpacking libgraphite2-3:amd64 (1.3.14-2ubuntu1) ... 135s Selecting previously unselected package libharfbuzz0b:amd64. 135s Preparing to unpack .../053-libharfbuzz0b_10.0.1-1_amd64.deb ... 135s Unpacking libharfbuzz0b:amd64 (10.0.1-1) ... 135s Selecting previously unselected package libsz2:amd64. 135s Preparing to unpack .../054-libsz2_1.1.3-1_amd64.deb ... 135s Unpacking libsz2:amd64 (1.1.3-1) ... 135s Selecting previously unselected package libhdf5-103-1t64:amd64. 135s Preparing to unpack .../055-libhdf5-103-1t64_1.10.10+repack-4ubuntu3_amd64.deb ... 135s Unpacking libhdf5-103-1t64:amd64 (1.10.10+repack-4ubuntu3) ... 135s Selecting previously unselected package libice6:amd64. 135s Preparing to unpack .../056-libice6_2%3a1.1.1-1_amd64.deb ... 135s Unpacking libice6:amd64 (2:1.1.1-1) ... 135s Selecting previously unselected package libimagequant0:amd64. 135s Preparing to unpack .../057-libimagequant0_2.18.0-1build1_amd64.deb ... 135s Unpacking libimagequant0:amd64 (2.18.0-1build1) ... 135s Selecting previously unselected package libwacom-common. 135s Preparing to unpack .../058-libwacom-common_2.13.0-1_all.deb ... 135s Unpacking libwacom-common (2.13.0-1) ... 136s Selecting previously unselected package libwacom9:amd64. 136s Preparing to unpack .../059-libwacom9_2.13.0-1_amd64.deb ... 136s Unpacking libwacom9:amd64 (2.13.0-1) ... 136s Selecting previously unselected package libinput-bin. 136s Preparing to unpack .../060-libinput-bin_1.26.2-1_amd64.deb ... 136s Unpacking libinput-bin (1.26.2-1) ... 136s Selecting previously unselected package libmtdev1t64:amd64. 136s Preparing to unpack .../061-libmtdev1t64_1.1.6-1.2_amd64.deb ... 136s Unpacking libmtdev1t64:amd64 (1.1.6-1.2) ... 136s Selecting previously unselected package libinput10:amd64. 136s Preparing to unpack .../062-libinput10_1.26.2-1_amd64.deb ... 136s Unpacking libinput10:amd64 (1.26.2-1) ... 136s Selecting previously unselected package libjpeg-turbo8:amd64. 136s Preparing to unpack .../063-libjpeg-turbo8_2.1.5-3ubuntu2_amd64.deb ... 136s Unpacking libjpeg-turbo8:amd64 (2.1.5-3ubuntu2) ... 136s Selecting previously unselected package libjpeg8:amd64. 136s Preparing to unpack .../064-libjpeg8_8c-2ubuntu11_amd64.deb ... 136s Unpacking libjpeg8:amd64 (8c-2ubuntu11) ... 136s Selecting previously unselected package libjs-jquery. 136s Preparing to unpack .../065-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 136s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 136s Selecting previously unselected package libjs-jquery-ui. 136s Preparing to unpack .../066-libjs-jquery-ui_1.13.2+dfsg-1_all.deb ... 136s Unpacking libjs-jquery-ui (1.13.2+dfsg-1) ... 136s Selecting previously unselected package liblbfgsb0:amd64. 
136s Preparing to unpack .../067-liblbfgsb0_3.0+dfsg.4-1build1_amd64.deb ... 136s Unpacking liblbfgsb0:amd64 (3.0+dfsg.4-1build1) ... 136s Selecting previously unselected package liblcms2-2:amd64. 136s Preparing to unpack .../068-liblcms2-2_2.16-2_amd64.deb ... 136s Unpacking liblcms2-2:amd64 (2.16-2) ... 136s Selecting previously unselected package liblerc4:amd64. 136s Preparing to unpack .../069-liblerc4_4.0.0+ds-5ubuntu1_amd64.deb ... 136s Unpacking liblerc4:amd64 (4.0.0+ds-5ubuntu1) ... 136s Selecting previously unselected package libmd4c0:amd64. 136s Preparing to unpack .../070-libmd4c0_0.5.2-2_amd64.deb ... 136s Unpacking libmd4c0:amd64 (0.5.2-2) ... 136s Selecting previously unselected package libpcre2-16-0:amd64. 136s Preparing to unpack .../071-libpcre2-16-0_10.42-4ubuntu3_amd64.deb ... 136s Unpacking libpcre2-16-0:amd64 (10.42-4ubuntu3) ... 136s Selecting previously unselected package libpixman-1-0:amd64. 136s Preparing to unpack .../072-libpixman-1-0_0.44.0-3_amd64.deb ... 136s Unpacking libpixman-1-0:amd64 (0.44.0-3) ... 136s Selecting previously unselected package libpython3.13-stdlib:amd64. 136s Preparing to unpack .../073-libpython3.13-stdlib_3.13.0-2_amd64.deb ... 136s Unpacking libpython3.13-stdlib:amd64 (3.13.0-2) ... 136s Selecting previously unselected package libqhull-r8.0:amd64. 136s Preparing to unpack .../074-libqhull-r8.0_2020.2-6build1_amd64.deb ... 136s Unpacking libqhull-r8.0:amd64 (2020.2-6build1) ... 136s Selecting previously unselected package libqt5core5t64:amd64. 136s Preparing to unpack .../075-libqt5core5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 136s Unpacking libqt5core5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 136s Selecting previously unselected package libqt5dbus5t64:amd64. 136s Preparing to unpack .../076-libqt5dbus5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 136s Unpacking libqt5dbus5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 136s Selecting previously unselected package libglvnd0:amd64. 136s Preparing to unpack .../077-libglvnd0_1.7.0-1build1_amd64.deb ... 136s Unpacking libglvnd0:amd64 (1.7.0-1build1) ... 136s Selecting previously unselected package libegl1:amd64. 136s Preparing to unpack .../078-libegl1_1.7.0-1build1_amd64.deb ... 136s Unpacking libegl1:amd64 (1.7.0-1build1) ... 136s Selecting previously unselected package libglx0:amd64. 136s Preparing to unpack .../079-libglx0_1.7.0-1build1_amd64.deb ... 136s Unpacking libglx0:amd64 (1.7.0-1build1) ... 136s Selecting previously unselected package libgl1:amd64. 136s Preparing to unpack .../080-libgl1_1.7.0-1build1_amd64.deb ... 136s Unpacking libgl1:amd64 (1.7.0-1build1) ... 136s Selecting previously unselected package libqt5network5t64:amd64. 136s Preparing to unpack .../081-libqt5network5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 136s Unpacking libqt5network5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 136s Selecting previously unselected package libsm6:amd64. 136s Preparing to unpack .../082-libsm6_2%3a1.2.4-1_amd64.deb ... 136s Unpacking libsm6:amd64 (2:1.2.4-1) ... 136s Selecting previously unselected package libxcb-icccm4:amd64. 136s Preparing to unpack .../083-libxcb-icccm4_0.4.2-1_amd64.deb ... 136s Unpacking libxcb-icccm4:amd64 (0.4.2-1) ... 136s Selecting previously unselected package libxcb-util1:amd64. 136s Preparing to unpack .../084-libxcb-util1_0.4.0-1build3_amd64.deb ... 136s Unpacking libxcb-util1:amd64 (0.4.0-1build3) ... 136s Selecting previously unselected package libxcb-image0:amd64. 136s Preparing to unpack .../085-libxcb-image0_0.4.0-2build1_amd64.deb ... 
136s Unpacking libxcb-image0:amd64 (0.4.0-2build1) ... 136s Selecting previously unselected package libxcb-keysyms1:amd64. 136s Preparing to unpack .../086-libxcb-keysyms1_0.4.0-1build4_amd64.deb ... 136s Unpacking libxcb-keysyms1:amd64 (0.4.0-1build4) ... 136s Selecting previously unselected package libxcb-render0:amd64. 136s Preparing to unpack .../087-libxcb-render0_1.17.0-2_amd64.deb ... 136s Unpacking libxcb-render0:amd64 (1.17.0-2) ... 136s Selecting previously unselected package libxcb-render-util0:amd64. 136s Preparing to unpack .../088-libxcb-render-util0_0.3.9-1build4_amd64.deb ... 136s Unpacking libxcb-render-util0:amd64 (0.3.9-1build4) ... 136s Selecting previously unselected package libxcb-shape0:amd64. 137s Preparing to unpack .../089-libxcb-shape0_1.17.0-2_amd64.deb ... 137s Unpacking libxcb-shape0:amd64 (1.17.0-2) ... 137s Selecting previously unselected package libxcb-xinerama0:amd64. 137s Preparing to unpack .../090-libxcb-xinerama0_1.17.0-2_amd64.deb ... 137s Unpacking libxcb-xinerama0:amd64 (1.17.0-2) ... 137s Selecting previously unselected package libxcb-xinput0:amd64. 137s Preparing to unpack .../091-libxcb-xinput0_1.17.0-2_amd64.deb ... 137s Unpacking libxcb-xinput0:amd64 (1.17.0-2) ... 137s Selecting previously unselected package libxcb-xkb1:amd64. 137s Preparing to unpack .../092-libxcb-xkb1_1.17.0-2_amd64.deb ... 137s Unpacking libxcb-xkb1:amd64 (1.17.0-2) ... 137s Selecting previously unselected package libxkbcommon-x11-0:amd64. 137s Preparing to unpack .../093-libxkbcommon-x11-0_1.7.0-1_amd64.deb ... 137s Unpacking libxkbcommon-x11-0:amd64 (1.7.0-1) ... 137s Selecting previously unselected package libqt5gui5t64:amd64. 137s Preparing to unpack .../094-libqt5gui5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 137s Unpacking libqt5gui5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 137s Selecting previously unselected package libqt5widgets5t64:amd64. 137s Preparing to unpack .../095-libqt5widgets5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 137s Unpacking libqt5widgets5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 137s Selecting previously unselected package libqt5xml5t64:amd64. 137s Preparing to unpack .../096-libqt5xml5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 137s Unpacking libqt5xml5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 137s Selecting previously unselected package libqt5designer5:amd64. 137s Preparing to unpack .../097-libqt5designer5_5.15.15-2_amd64.deb ... 137s Unpacking libqt5designer5:amd64 (5.15.15-2) ... 137s Selecting previously unselected package libqt5sql5t64:amd64. 137s Preparing to unpack .../098-libqt5sql5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 137s Unpacking libqt5sql5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 137s Selecting previously unselected package libqt5help5:amd64. 137s Preparing to unpack .../099-libqt5help5_5.15.15-2_amd64.deb ... 137s Unpacking libqt5help5:amd64 (5.15.15-2) ... 137s Selecting previously unselected package libqt5printsupport5t64:amd64. 137s Preparing to unpack .../100-libqt5printsupport5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 137s Unpacking libqt5printsupport5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 137s Selecting previously unselected package libqt5test5t64:amd64. 137s Preparing to unpack .../101-libqt5test5t64_5.15.15+dfsg-1ubuntu1_amd64.deb ... 137s Unpacking libqt5test5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 137s Selecting previously unselected package libraqm0:amd64. 137s Preparing to unpack .../102-libraqm0_0.10.1-1build1_amd64.deb ... 137s Unpacking libraqm0:amd64 (0.10.1-1build1) ... 137s Selecting previously unselected package libsharpyuv0:amd64. 
137s Preparing to unpack .../103-libsharpyuv0_1.4.0-0.1_amd64.deb ... 137s Unpacking libsharpyuv0:amd64 (1.4.0-0.1) ... 137s Selecting previously unselected package libjbig0:amd64. 137s Preparing to unpack .../104-libjbig0_2.1-6.1ubuntu2_amd64.deb ... 137s Unpacking libjbig0:amd64 (2.1-6.1ubuntu2) ... 137s Selecting previously unselected package libwebp7:amd64. 137s Preparing to unpack .../105-libwebp7_1.4.0-0.1_amd64.deb ... 137s Unpacking libwebp7:amd64 (1.4.0-0.1) ... 137s Selecting previously unselected package libtiff6:amd64. 137s Preparing to unpack .../106-libtiff6_4.5.1+git230720-4ubuntu4_amd64.deb ... 137s Unpacking libtiff6:amd64 (4.5.1+git230720-4ubuntu4) ... 137s Selecting previously unselected package libwebpdemux2:amd64. 137s Preparing to unpack .../107-libwebpdemux2_1.4.0-0.1_amd64.deb ... 137s Unpacking libwebpdemux2:amd64 (1.4.0-0.1) ... 137s Selecting previously unselected package libwebpmux3:amd64. 137s Preparing to unpack .../108-libwebpmux3_1.4.0-0.1_amd64.deb ... 137s Unpacking libwebpmux3:amd64 (1.4.0-0.1) ... 137s Selecting previously unselected package libxt6t64:amd64. 137s Preparing to unpack .../109-libxt6t64_1%3a1.2.1-1.2build1_amd64.deb ... 137s Unpacking libxt6t64:amd64 (1:1.2.1-1.2build1) ... 137s Selecting previously unselected package libxmu6:amd64. 137s Preparing to unpack .../110-libxmu6_2%3a1.1.3-3build2_amd64.deb ... 137s Unpacking libxmu6:amd64 (2:1.1.3-3build2) ... 137s Selecting previously unselected package libxpm4:amd64. 137s Preparing to unpack .../111-libxpm4_1%3a3.5.17-1build2_amd64.deb ... 137s Unpacking libxpm4:amd64 (1:3.5.17-1build2) ... 137s Selecting previously unselected package libxaw7:amd64. 137s Preparing to unpack .../112-libxaw7_2%3a1.0.16-1_amd64.deb ... 137s Unpacking libxaw7:amd64 (2:1.0.16-1) ... 137s Selecting previously unselected package libxfont2:amd64. 137s Preparing to unpack .../113-libxfont2_1%3a2.0.6-1build1_amd64.deb ... 137s Unpacking libxfont2:amd64 (1:2.0.6-1build1) ... 137s Selecting previously unselected package libxkbfile1:amd64. 137s Preparing to unpack .../114-libxkbfile1_1%3a1.1.0-1build4_amd64.deb ... 137s Unpacking libxkbfile1:amd64 (1:1.1.0-1build4) ... 137s Selecting previously unselected package libxrandr2:amd64. 137s Preparing to unpack .../115-libxrandr2_2%3a1.5.4-1_amd64.deb ... 137s Unpacking libxrandr2:amd64 (2:1.5.4-1) ... 137s Selecting previously unselected package libxslt1.1:amd64. 137s Preparing to unpack .../116-libxslt1.1_1.1.39-0exp1ubuntu1_amd64.deb ... 137s Unpacking libxslt1.1:amd64 (1.1.39-0exp1ubuntu1) ... 137s Selecting previously unselected package python-matplotlib-data. 137s Preparing to unpack .../117-python-matplotlib-data_3.8.3-3ubuntu1_all.deb ... 137s Unpacking python-matplotlib-data (3.8.3-3ubuntu1) ... 137s Selecting previously unselected package python-tables-data. 137s Preparing to unpack .../118-python-tables-data_3.10.1-1build1_all.deb ... 137s Unpacking python-tables-data (3.10.1-1build1) ... 137s Selecting previously unselected package python3.13. 137s Preparing to unpack .../119-python3.13_3.13.0-2_amd64.deb ... 137s Unpacking python3.13 (3.13.0-2) ... 137s Selecting previously unselected package python3-all. 137s Preparing to unpack .../120-python3-all_3.12.7-1_amd64.deb ... 137s Unpacking python3-all (3.12.7-1) ... 137s Selecting previously unselected package python3-appdirs. 137s Preparing to unpack .../121-python3-appdirs_1.4.4-4_all.deb ... 137s Unpacking python3-appdirs (1.4.4-4) ... 137s Selecting previously unselected package python3-async-generator. 
137s Preparing to unpack .../122-python3-async-generator_1.10-4_all.deb ... 137s Unpacking python3-async-generator (1.10-4) ... 137s Selecting previously unselected package python3-bottleneck. 137s Preparing to unpack .../123-python3-bottleneck_1.3.8+ds1-1build1_amd64.deb ... 137s Unpacking python3-bottleneck (1.3.8+ds1-1build1) ... 137s Selecting previously unselected package python3-brotli. 138s Preparing to unpack .../124-python3-brotli_1.1.0-2build3_amd64.deb ... 138s Unpacking python3-brotli (1.1.0-2build3) ... 138s Selecting previously unselected package python3-soupsieve. 138s Preparing to unpack .../125-python3-soupsieve_2.6-1_all.deb ... 138s Unpacking python3-soupsieve (2.6-1) ... 138s Selecting previously unselected package python3-bs4. 138s Preparing to unpack .../126-python3-bs4_4.12.3-3_all.deb ... 138s Unpacking python3-bs4 (4.12.3-3) ... 138s Selecting previously unselected package python3-colorama. 138s Preparing to unpack .../127-python3-colorama_0.4.6-4_all.deb ... 138s Unpacking python3-colorama (0.4.6-4) ... 138s Selecting previously unselected package python3-click. 138s Preparing to unpack .../128-python3-click_8.1.7-2_all.deb ... 138s Unpacking python3-click (8.1.7-2) ... 138s Selecting previously unselected package python3-cloudpickle. 138s Preparing to unpack .../129-python3-cloudpickle_3.0.0-2_all.deb ... 138s Unpacking python3-cloudpickle (3.0.0-2) ... 138s Selecting previously unselected package python3-contourpy. 138s Preparing to unpack .../130-python3-contourpy_1.3.0-2build1_amd64.deb ... 138s Unpacking python3-contourpy (1.3.0-2build1) ... 138s Selecting previously unselected package python3-cpuinfo. 138s Preparing to unpack .../131-python3-cpuinfo_9.0.0+git20221119-2_all.deb ... 138s Unpacking python3-cpuinfo (9.0.0+git20221119-2) ... 138s Selecting previously unselected package python3-cycler. 138s Preparing to unpack .../132-python3-cycler_0.12.1-1_all.deb ... 138s Unpacking python3-cycler (0.12.1-1) ... 138s Selecting previously unselected package python3-fsspec. 138s Preparing to unpack .../133-python3-fsspec_2024.9.0-1_all.deb ... 138s Unpacking python3-fsspec (2024.9.0-1) ... 138s Selecting previously unselected package python3-toolz. 138s Preparing to unpack .../134-python3-toolz_1.0.0-1_all.deb ... 138s Unpacking python3-toolz (1.0.0-1) ... 138s Selecting previously unselected package python3-packaging. 138s Preparing to unpack .../135-python3-packaging_24.2-1_all.deb ... 138s Unpacking python3-packaging (24.2-1) ... 138s Selecting previously unselected package python3-locket. 138s Preparing to unpack .../136-python3-locket_1.0.0-2_all.deb ... 138s Unpacking python3-locket (1.0.0-2) ... 138s Selecting previously unselected package python3-partd. 138s Preparing to unpack .../137-python3-partd_1.4.2-1_all.deb ... 138s Unpacking python3-partd (1.4.2-1) ... 138s Selecting previously unselected package python3-dask. 138s Preparing to unpack .../138-python3-dask_2024.5.2+dfsg-1_all.deb ... 138s Unpacking python3-dask (2024.5.2+dfsg-1) ... 138s Selecting previously unselected package python3-decorator. 138s Preparing to unpack .../139-python3-decorator_5.1.1-5_all.deb ... 138s Unpacking python3-decorator (5.1.1-5) ... 138s Selecting previously unselected package python3-defusedxml. 138s Preparing to unpack .../140-python3-defusedxml_0.7.1-2_all.deb ... 138s Unpacking python3-defusedxml (0.7.1-2) ... 138s Selecting previously unselected package python3-et-xmlfile. 138s Preparing to unpack .../141-python3-et-xmlfile_2.0.0-1_all.deb ... 
138s Unpacking python3-et-xmlfile (2.0.0-1) ... 138s Selecting previously unselected package python3-execnet. 138s Preparing to unpack .../142-python3-execnet_2.1.1-1_all.deb ... 138s Unpacking python3-execnet (2.1.1-1) ... 138s Selecting previously unselected package python3-six. 138s Preparing to unpack .../143-python3-six_1.16.0-7_all.deb ... 138s Unpacking python3-six (1.16.0-7) ... 138s Selecting previously unselected package python3-fs. 138s Preparing to unpack .../144-python3-fs_2.4.16-4_all.deb ... 138s Unpacking python3-fs (2.4.16-4) ... 138s Selecting previously unselected package python3-lxml:amd64. 138s Preparing to unpack .../145-python3-lxml_5.3.0-1build1_amd64.deb ... 138s Unpacking python3-lxml:amd64 (5.3.0-1build1) ... 138s Selecting previously unselected package python3-lz4. 138s Preparing to unpack .../146-python3-lz4_4.0.2+dfsg-1build5_amd64.deb ... 138s Unpacking python3-lz4 (4.0.2+dfsg-1build5) ... 138s Selecting previously unselected package python3-scipy. 138s Preparing to unpack .../147-python3-scipy_1.13.1-5ubuntu1_amd64.deb ... 138s Unpacking python3-scipy (1.13.1-5ubuntu1) ... 139s Selecting previously unselected package python3-mpmath. 139s Preparing to unpack .../148-python3-mpmath_1.3.0-1_all.deb ... 139s Unpacking python3-mpmath (1.3.0-1) ... 139s Selecting previously unselected package python3-sympy. 139s Preparing to unpack .../149-python3-sympy_1.13.3-1_all.deb ... 139s Unpacking python3-sympy (1.13.3-1) ... 139s Selecting previously unselected package python3-ufolib2. 139s Preparing to unpack .../150-python3-ufolib2_0.16.1+dfsg1-1_all.deb ... 139s Unpacking python3-ufolib2 (0.16.1+dfsg1-1) ... 139s Selecting previously unselected package unicode-data. 139s Preparing to unpack .../151-unicode-data_15.1.0-1_all.deb ... 139s Unpacking unicode-data (15.1.0-1) ... 139s Selecting previously unselected package python3-fonttools. 139s Preparing to unpack .../152-python3-fonttools_4.55.0-3_amd64.deb ... 139s Unpacking python3-fonttools (4.55.0-3) ... 139s Selecting previously unselected package python3-webencodings. 139s Preparing to unpack .../153-python3-webencodings_0.5.1-5_all.deb ... 139s Unpacking python3-webencodings (0.5.1-5) ... 139s Selecting previously unselected package python3-html5lib. 139s Preparing to unpack .../154-python3-html5lib_1.2-1_all.deb ... 139s Unpacking python3-html5lib (1.2-1) ... 139s Selecting previously unselected package python3-sortedcontainers. 139s Preparing to unpack .../155-python3-sortedcontainers_2.4.0-2_all.deb ... 139s Unpacking python3-sortedcontainers (2.4.0-2) ... 139s Selecting previously unselected package python3-hypothesis. 139s Preparing to unpack .../156-python3-hypothesis_6.119.3-1_all.deb ... 139s Unpacking python3-hypothesis (6.119.3-1) ... 139s Selecting previously unselected package python3-iniconfig. 139s Preparing to unpack .../157-python3-iniconfig_1.1.1-2_all.deb ... 139s Unpacking python3-iniconfig (1.1.1-2) ... 139s Selecting previously unselected package python3-kiwisolver. 139s Preparing to unpack .../158-python3-kiwisolver_1.4.7-2build1_amd64.deb ... 139s Unpacking python3-kiwisolver (1.4.7-2build1) ... 139s Selecting previously unselected package libopenjp2-7:amd64. 139s Preparing to unpack .../159-libopenjp2-7_2.5.0-2ubuntu1_amd64.deb ... 139s Unpacking libopenjp2-7:amd64 (2.5.0-2ubuntu1) ... 139s Selecting previously unselected package python3-pil:amd64. 139s Preparing to unpack .../160-python3-pil_10.4.0-1ubuntu2_amd64.deb ... 139s Unpacking python3-pil:amd64 (10.4.0-1ubuntu2) ... 
140s Selecting previously unselected package python3.12-tk. 140s Preparing to unpack .../161-python3.12-tk_3.12.7-3_amd64.deb ... 140s Unpacking python3.12-tk (3.12.7-3) ... 140s Selecting previously unselected package python3.13-tk. 140s Preparing to unpack .../162-python3.13-tk_3.13.0-2_amd64.deb ... 140s Unpacking python3.13-tk (3.13.0-2) ... 140s Selecting previously unselected package python3-tk:amd64. 140s Preparing to unpack .../163-python3-tk_3.12.7-1_amd64.deb ... 140s Unpacking python3-tk:amd64 (3.12.7-1) ... 140s Selecting previously unselected package python3-pil.imagetk:amd64. 140s Preparing to unpack .../164-python3-pil.imagetk_10.4.0-1ubuntu2_amd64.deb ... 140s Unpacking python3-pil.imagetk:amd64 (10.4.0-1ubuntu2) ... 140s Selecting previously unselected package python3-matplotlib. 140s Preparing to unpack .../165-python3-matplotlib_3.8.3-3ubuntu1_amd64.deb ... 140s Unpacking python3-matplotlib (3.8.3-3ubuntu1) ... 140s Selecting previously unselected package python3-numexpr. 140s Preparing to unpack .../166-python3-numexpr_2.10.1-2build1_amd64.deb ... 140s Unpacking python3-numexpr (2.10.1-2build1) ... 140s Selecting previously unselected package python3-odf. 140s Preparing to unpack .../167-python3-odf_1.4.2-3_all.deb ... 140s Unpacking python3-odf (1.4.2-3) ... 140s Selecting previously unselected package python3-openpyxl. 140s Preparing to unpack .../168-python3-openpyxl_3.1.5+dfsg-1_all.deb ... 140s Unpacking python3-openpyxl (3.1.5+dfsg-1) ... 140s Selecting previously unselected package python3-pluggy. 140s Preparing to unpack .../169-python3-pluggy_1.5.0-1_all.deb ... 140s Unpacking python3-pluggy (1.5.0-1) ... 140s Selecting previously unselected package python3-py. 140s Preparing to unpack .../170-python3-py_1.11.0-2_all.deb ... 140s Unpacking python3-py (1.11.0-2) ... 140s Selecting previously unselected package python3-pyqt5.sip. 140s Preparing to unpack .../171-python3-pyqt5.sip_12.15.0-1build1_amd64.deb ... 140s Unpacking python3-pyqt5.sip (12.15.0-1build1) ... 140s Selecting previously unselected package python3-pyqt5. 140s Preparing to unpack .../172-python3-pyqt5_5.15.11+dfsg-1build1_amd64.deb ... 140s Unpacking python3-pyqt5 (5.15.11+dfsg-1build1) ... 140s Selecting previously unselected package python3-pyreadstat. 140s Preparing to unpack .../173-python3-pyreadstat_1.2.8-1_amd64.deb ... 140s Unpacking python3-pyreadstat (1.2.8-1) ... 140s Selecting previously unselected package python3-pytest. 140s Preparing to unpack .../174-python3-pytest_8.3.3-1_all.deb ... 140s Unpacking python3-pytest (8.3.3-1) ... 140s Selecting previously unselected package python3-pytest-asyncio. 140s Preparing to unpack .../175-python3-pytest-asyncio_0.20.3-1.3_all.deb ... 140s Unpacking python3-pytest-asyncio (0.20.3-1.3) ... 140s Selecting previously unselected package python3-pytest-forked. 140s Preparing to unpack .../176-python3-pytest-forked_1.6.0-2_all.deb ... 140s Unpacking python3-pytest-forked (1.6.0-2) ... 140s Selecting previously unselected package python3-werkzeug. 140s Preparing to unpack .../177-python3-werkzeug_3.0.4-1ubuntu1_all.deb ... 140s Unpacking python3-werkzeug (3.0.4-1ubuntu1) ... 140s Selecting previously unselected package python3-pytest-localserver. 140s Preparing to unpack .../178-python3-pytest-localserver_0.8.1-2_all.deb ... 140s Unpacking python3-pytest-localserver (0.8.1-2) ... 140s Selecting previously unselected package python3-pytest-xdist. 140s Preparing to unpack .../179-python3-pytest-xdist_3.6.1-1_all.deb ... 
140s Unpacking python3-pytest-xdist (3.6.1-1) ... 140s Selecting previously unselected package python3-pytestqt. 140s Preparing to unpack .../180-python3-pytestqt_4.3.1-1_all.deb ... 140s Unpacking python3-pytestqt (4.3.1-1) ... 140s Selecting previously unselected package python3-greenlet. 140s Preparing to unpack .../181-python3-greenlet_3.1.0-1_amd64.deb ... 140s Unpacking python3-greenlet (3.1.0-1) ... 140s Selecting previously unselected package python3-sqlalchemy. 140s Preparing to unpack .../182-python3-sqlalchemy_2.0.32+ds1-1ubuntu3_all.deb ... 140s Unpacking python3-sqlalchemy (2.0.32+ds1-1ubuntu3) ... 140s Selecting previously unselected package python3-tables-lib. 140s Preparing to unpack .../183-python3-tables-lib_3.10.1-1build1_amd64.deb ... 140s Unpacking python3-tables-lib (3.10.1-1build1) ... 140s Selecting previously unselected package python3-tables. 140s Preparing to unpack .../184-python3-tables_3.10.1-1build1_all.deb ... 140s Unpacking python3-tables (3.10.1-1build1) ... 140s Selecting previously unselected package python3-tabulate. 140s Preparing to unpack .../185-python3-tabulate_0.9.0-1_all.deb ... 140s Unpacking python3-tabulate (0.9.0-1) ... 140s Selecting previously unselected package python3-xarray. 140s Preparing to unpack .../186-python3-xarray_2024.09.0-1_all.deb ... 140s Unpacking python3-xarray (2024.09.0-1) ... 141s Selecting previously unselected package python3-xlrd. 141s Preparing to unpack .../187-python3-xlrd_2.0.1-2_all.deb ... 141s Unpacking python3-xlrd (2.0.1-2) ... 141s Selecting previously unselected package python3-xlsxwriter. 141s Preparing to unpack .../188-python3-xlsxwriter_3.1.9-1_all.deb ... 141s Unpacking python3-xlsxwriter (3.1.9-1) ... 141s Selecting previously unselected package python3-zstandard. 141s Preparing to unpack .../189-python3-zstandard_0.23.0-2build1_amd64.deb ... 141s Unpacking python3-zstandard (0.23.0-2build1) ... 141s Selecting previously unselected package tzdata-legacy. 141s Preparing to unpack .../190-tzdata-legacy_2024b-1ubuntu2_all.deb ... 141s Unpacking tzdata-legacy (2024b-1ubuntu2) ... 141s Selecting previously unselected package x11-xkb-utils. 141s Preparing to unpack .../191-x11-xkb-utils_7.7+9_amd64.deb ... 141s Unpacking x11-xkb-utils (7.7+9) ... 141s Selecting previously unselected package xsel. 141s Preparing to unpack .../192-xsel_1.2.1-1_amd64.deb ... 141s Unpacking xsel (1.2.1-1) ... 141s Selecting previously unselected package xserver-common. 141s Preparing to unpack .../193-xserver-common_2%3a21.1.14-2ubuntu1_all.deb ... 141s Unpacking xserver-common (2:21.1.14-2ubuntu1) ... 141s Selecting previously unselected package xvfb. 141s Preparing to unpack .../194-xvfb_2%3a21.1.14-2ubuntu1_amd64.deb ... 141s Unpacking xvfb (2:21.1.14-2ubuntu1) ... 141s Selecting previously unselected package locales-all. 141s Preparing to unpack .../195-locales-all_2.40-1ubuntu3_amd64.deb ... 141s Unpacking locales-all (2.40-1ubuntu3) ... 142s Setting up libgraphite2-3:amd64 (1.3.14-2ubuntu1) ... 142s Setting up xsel (1.2.1-1) ... 142s Setting up libxcb-dri3-0:amd64 (1.17.0-2) ... 142s Setting up liblcms2-2:amd64 (2.16-2) ... 142s Setting up python3-iniconfig (1.1.1-2) ... 143s Setting up libpixman-1-0:amd64 (0.44.0-3) ... 143s Setting up libsharpyuv0:amd64 (1.4.0-0.1) ... 143s Setting up libwayland-server0:amd64 (1.23.0-1) ... 143s Setting up libx11-xcb1:amd64 (2:1.8.10-2) ... 143s Setting up libpciaccess0:amd64 (0.17-3build1) ... 143s Setting up libdouble-conversion3:amd64 (3.3.0-1build1) ... 
143s Setting up libxcb-xfixes0:amd64 (1.17.0-2) ... 143s Setting up liblerc4:amd64 (4.0.0+ds-5ubuntu1) ... 143s Setting up libxpm4:amd64 (1:3.5.17-1build2) ... 143s Setting up python3-async-generator (1.10-4) ... 143s Setting up libxcb-xinput0:amd64 (1.17.0-2) ... 143s Setting up libxrender1:amd64 (1:0.9.10-1.1build1) ... 143s Setting up python3-py (1.11.0-2) ... 143s Setting up python3-colorama (0.4.6-4) ... 143s Setting up python3-lz4 (4.0.2+dfsg-1build5) ... 143s Setting up libxcb-render0:amd64 (1.17.0-2) ... 143s Setting up python3-defusedxml (0.7.1-2) ... 143s Setting up libdrm-radeon1:amd64 (2.4.123-1) ... 143s Setting up libglvnd0:amd64 (1.7.0-1build1) ... 143s Setting up fonts-lyx (2.4.2.1-1) ... 143s Setting up libxcb-glx0:amd64 (1.17.0-2) ... 143s Setting up python3-fsspec (2024.9.0-1) ... 144s Setting up libdrm-intel1:amd64 (2.4.123-1) ... 144s Setting up libxcb-keysyms1:amd64 (0.4.0-1build4) ... 144s Setting up libxcb-shape0:amd64 (1.17.0-2) ... 144s Setting up x11-common (1:7.7+23ubuntu3) ... 144s Setting up libdeflate0:amd64 (1.22-1) ... 144s Setting up python3-tabulate (0.9.0-1) ... 144s Setting up libqhull-r8.0:amd64 (2020.2-6build1) ... 144s Setting up libxcb-render-util0:amd64 (0.3.9-1build4) ... 144s Setting up libxcb-shm0:amd64 (1.17.0-2) ... 144s Setting up libxcb-icccm4:amd64 (0.4.2-1) ... 144s Setting up python3-sortedcontainers (2.4.0-2) ... 145s Setting up python3-click (8.1.7-2) ... 145s Setting up libjbig0:amd64 (2.1-6.1ubuntu2) ... 145s Setting up python3-webencodings (0.5.1-5) ... 145s Setting up python3-pyreadstat (1.2.8-1) ... 145s Setting up locales-all (2.40-1ubuntu3) ... 145s Setting up libpcre2-16-0:amd64 (10.42-4ubuntu3) ... 145s Setting up libaec0:amd64 (1.1.3-1) ... 145s Setting up tzdata-legacy (2024b-1ubuntu2) ... 145s Setting up libxcb-util1:amd64 (0.4.0-1build3) ... 145s Setting up libxxf86vm1:amd64 (1:1.1.4-1build4) ... 145s Setting up python3-cloudpickle (3.0.0-2) ... 145s Setting up libsnappy1v5:amd64 (1.2.1-1) ... 145s Setting up libxcb-xkb1:amd64 (1.17.0-2) ... 145s Setting up libxcb-image0:amd64 (0.4.0-2build1) ... 145s Setting up libxcb-present0:amd64 (1.17.0-2) ... 145s Setting up unicode-data (15.1.0-1) ... 145s Setting up python3-six (1.16.0-7) ... 145s Setting up libpython3.13-minimal:amd64 (3.13.0-2) ... 145s Setting up libqt5core5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 145s Setting up python3-decorator (5.1.1-5) ... 145s Setting up libblosc2-4:amd64 (2.15.1+ds-1) ... 145s Setting up libfontenc1:amd64 (1:1.1.8-1build1) ... 145s Setting up python3-zstandard (0.23.0-2build1) ... 146s Setting up python3-packaging (24.2-1) ... 146s Setting up libxcb-xinerama0:amd64 (1.17.0-2) ... 146s Setting up python3-xlsxwriter (3.1.9-1) ... 146s Setting up libxfixes3:amd64 (1:6.0.0-2build1) ... 146s Setting up libxcb-sync1:amd64 (1.17.0-2) ... 146s Setting up python3-brotli (1.1.0-2build3) ... 146s Setting up libavahi-common-data:amd64 (0.8-13ubuntu6) ... 146s Setting up python3-greenlet (3.1.0-1) ... 147s Setting up python3-cycler (0.12.1-1) ... 147s Setting up libimagequant0:amd64 (2.18.0-1build1) ... 147s Setting up libxkbcommon-x11-0:amd64 (1.7.0-1) ... 147s Setting up fonts-dejavu-mono (2.37-8) ... 147s Setting up python3-kiwisolver (1.4.7-2build1) ... 147s Setting up python3-bottleneck (1.3.8+ds1-1build1) ... 147s Setting up libxrandr2:amd64 (2:1.5.4-1) ... 147s Setting up libtcl8.6:amd64 (8.6.15+dfsg-2) ... 147s Setting up fonts-dejavu-core (2.37-8) ... 147s Setting up python3-numexpr (2.10.1-2build1) ... 
147s Setting up libjpeg-turbo8:amd64 (2.1.5-3ubuntu2) ... 147s Setting up python3-cpuinfo (9.0.0+git20221119-2) ... 147s Setting up python3-html5lib (1.2-1) ... 148s Setting up libglapi-mesa:amd64 (24.2.3-1ubuntu1) ... 148s Setting up libvulkan1:amd64 (1.3.296.0-1) ... 148s Setting up python3-pluggy (1.5.0-1) ... 148s Setting up libwebp7:amd64 (1.4.0-0.1) ... 148s Setting up libxcb-dri2-0:amd64 (1.17.0-2) ... 148s Setting up python3-pyqt5.sip (12.15.0-1build1) ... 148s Setting up libmtdev1t64:amd64 (1.1.6-1.2) ... 148s Setting up libxshmfence1:amd64 (1.3-1build5) ... 148s Setting up libxcb-randr0:amd64 (1.17.0-2) ... 148s Setting up libxslt1.1:amd64 (1.1.39-0exp1ubuntu1) ... 148s Setting up libblosc1:amd64 (1.21.5+ds-1build1) ... 148s Setting up python3-et-xmlfile (2.0.0-1) ... 148s Setting up libqt5sql5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 148s Setting up libmd4c0:amd64 (0.5.2-2) ... 148s Setting up python3-xlrd (2.0.1-2) ... 148s Setting up libopenjp2-7:amd64 (2.5.0-2ubuntu1) ... 148s Setting up python3.13-minimal (3.13.0-2) ... 149s Setting up python3-toolz (1.0.0-1) ... 149s Setting up libharfbuzz0b:amd64 (10.0.1-1) ... 149s Setting up python3-contourpy (1.3.0-2build1) ... 149s Setting up libxss1:amd64 (1:1.2.3-1build3) ... 149s Setting up libxkbfile1:amd64 (1:1.1.0-1build4) ... 149s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 149s Setting up python3-mpmath (1.3.0-1) ... 150s Setting up python3-execnet (2.1.1-1) ... 150s Setting up python-matplotlib-data (3.8.3-3ubuntu1) ... 150s Setting up libwebpmux3:amd64 (1.4.0-0.1) ... 150s Setting up python3-locket (1.0.0-2) ... 150s Setting up python3-appdirs (1.4.4-4) ... 150s Setting up libxfont2:amd64 (1:2.0.6-1build1) ... 150s Setting up libpython3.13-stdlib:amd64 (3.13.0-2) ... 150s Setting up python3-soupsieve (2.6-1) ... 151s Setting up python-tables-data (3.10.1-1build1) ... 151s Setting up libsz2:amd64 (1.1.3-1) ... 151s Setting up liblbfgsb0:amd64 (3.0+dfsg.4-1build1) ... 151s Setting up python3-odf (1.4.2-3) ... 151s Setting up libdrm-amdgpu1:amd64 (2.4.123-1) ... 151s Setting up libwacom-common (2.13.0-1) ... 151s Setting up libwayland-client0:amd64 (1.23.0-1) ... 151s Setting up libjpeg8:amd64 (8c-2ubuntu11) ... 151s Setting up python3-partd (1.4.2-1) ... 151s Setting up python3-sympy (1.13.3-1) ... 160s Setting up libice6:amd64 (2:1.1.1-1) ... 160s Setting up mesa-libgallium:amd64 (24.2.3-1ubuntu1) ... 160s Setting up libqt5dbus5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 160s Setting up python3-scipy (1.13.1-5ubuntu1) ... 165s Setting up libgbm1:amd64 (24.2.3-1ubuntu1) ... 165s Setting up python3.13 (3.13.0-2) ... 166s Setting up libwacom9:amd64 (2.13.0-1) ... 166s Setting up fontconfig-config (2.15.0-1.1ubuntu2) ... 166s Setting up python3-pytest (8.3.3-1) ... 167s Setting up libwebpdemux2:amd64 (1.4.0-0.1) ... 167s Setting up python3-hypothesis (6.119.3-1) ... 167s Setting up python3-xarray (2024.09.0-1) ... 169s Setting up libgl1-mesa-dri:amd64 (24.2.3-1ubuntu1) ... 169s Setting up libqt5network5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 169s Setting up libavahi-common3:amd64 (0.8-13ubuntu6) ... 169s Setting up python3-dask (2024.5.2+dfsg-1) ... 171s Setting up libqt5xml5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 171s Setting up python3-all (3.12.7-1) ... 171s Setting up libqt5test5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 171s Setting up python3-bs4 (4.12.3-3) ... 171s Setting up python3-sqlalchemy (2.0.32+ds1-1ubuntu3) ... 173s Setting up libinput-bin (1.26.2-1) ... 173s Setting up python3-fs (2.4.16-4) ... 
173s Setting up python3-pytest-forked (1.6.0-2) ... 173s Setting up libegl-mesa0:amd64 (24.2.3-1ubuntu1) ... 173s Setting up libjs-jquery-ui (1.13.2+dfsg-1) ... 173s Setting up python3-werkzeug (3.0.4-1ubuntu1) ... 174s Setting up libraqm0:amd64 (0.10.1-1build1) ... 174s Setting up python3-pytest-asyncio (0.20.3-1.3) ... 174s Setting up python3-lxml:amd64 (5.3.0-1build1) ... 174s Setting up libtiff6:amd64 (4.5.1+git230720-4ubuntu4) ... 174s Setting up libegl1:amd64 (1.7.0-1build1) ... 174s Setting up libfontconfig1:amd64 (2.15.0-1.1ubuntu2) ... 174s Setting up libhdf5-103-1t64:amd64 (1.10.10+repack-4ubuntu3) ... 174s Setting up libsm6:amd64 (2:1.2.4-1) ... 174s Setting up python3-pytestqt (4.3.1-1) ... 174s Setting up libavahi-client3:amd64 (0.8-13ubuntu6) ... 174s Setting up libinput10:amd64 (1.26.2-1) ... 174s Setting up fontconfig (2.15.0-1.1ubuntu2) ... 176s Regenerating fonts cache... done. 176s Setting up libxft2:amd64 (2.3.6-1build1) ... 176s Setting up libglx-mesa0:amd64 (24.2.3-1ubuntu1) ... 176s Setting up python3-pytest-xdist (3.6.1-1) ... 176s Setting up libglx0:amd64 (1.7.0-1build1) ... 176s Setting up python3-tables-lib (3.10.1-1build1) ... 176s Setting up python3-pytest-localserver (0.8.1-2) ... 177s Setting up libtk8.6:amd64 (8.6.15-1) ... 177s Setting up python3.12-tk (3.12.7-3) ... 177s Setting up python3-tables (3.10.1-1build1) ... 178s Setting up python3.13-tk (3.13.0-2) ... 178s Setting up python3-pil:amd64 (10.4.0-1ubuntu2) ... 178s Setting up libgl1:amd64 (1.7.0-1build1) ... 178s Setting up python3-openpyxl (3.1.5+dfsg-1) ... 179s Setting up libxt6t64:amd64 (1:1.2.1-1.2build1) ... 179s Setting up libcups2t64:amd64 (2.4.10-1ubuntu2) ... 179s Setting up tk8.6-blt2.5 (2.5.3+dfsg-7build1) ... 179s Setting up libxmu6:amd64 (2:1.1.3-3build2) ... 179s Setting up blt (2.5.3+dfsg-7build1) ... 179s Setting up python3-tk:amd64 (3.12.7-1) ... 179s Setting up libxaw7:amd64 (2:1.0.16-1) ... 179s Setting up libqt5gui5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 179s Setting up python3-pil.imagetk:amd64 (10.4.0-1ubuntu2) ... 179s Setting up libqt5widgets5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 179s Setting up x11-xkb-utils (7.7+9) ... 179s Setting up libqt5help5:amd64 (5.15.15-2) ... 179s Setting up xserver-common (2:21.1.14-2ubuntu1) ... 179s Setting up libqt5printsupport5t64:amd64 (5.15.15+dfsg-1ubuntu1) ... 179s Setting up libqt5designer5:amd64 (5.15.15-2) ... 179s Setting up xvfb (2:21.1.14-2ubuntu1) ... 179s Setting up python3-pyqt5 (5.15.11+dfsg-1build1) ... 179s Setting up python3-fonttools (4.55.0-3) ... 180s Setting up python3-ufolib2 (0.16.1+dfsg1-1) ... 181s Setting up python3-matplotlib (3.8.3-3ubuntu1) ... 183s Processing triggers for libc-bin (2.40-1ubuntu3) ... 183s Processing triggers for systemd (256.5-2ubuntu4) ... 183s Processing triggers for man-db (2.13.0-1) ... 183s Processing triggers for udev (256.5-2ubuntu4) ... 185s Reading package lists... 185s Building dependency tree... 185s Reading state information... 185s Starting pkgProblemResolver with broken count: 0 185s Starting 2 pkgProblemResolver with broken count: 0 185s Done 185s The following NEW packages will be installed: 185s autopkgtest-satdep 185s 0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded. 185s Need to get 0 B/700 B of archives. 185s After this operation, 0 B of additional disk space will be used. 185s Get:1 /tmp/autopkgtest.rcV9Ni/4-autopkgtest-satdep.deb autopkgtest-satdep amd64 0 [700 B] 186s Selecting previously unselected package autopkgtest-satdep. 186s (Reading database ... 
98795 files and directories currently installed.) 186s Preparing to unpack .../4-autopkgtest-satdep.deb ... 186s Unpacking autopkgtest-satdep (0) ... 186s Setting up autopkgtest-satdep (0) ... 186s autopkgtest: WARNING: package python3-pandas:i386 is not installed though it should be 188s (Reading database ... 98795 files and directories currently installed.) 188s Removing autopkgtest-satdep (0) ... 188s autopkgtest [00:25:09]: test unittests3: [----------------------- 189s ++ dpkg --print-architecture 189s + arch=amd64 189s ++ py3versions -s 189s + pys='python3.13 python3.12' 189s + sourcetestroot=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests 189s + tomlfile=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 189s + echo amd64 i386 189s + grep amd64 189s ++ dpkg-vendor --query vendor 189s + '[' Debian = Ubuntu ']' 189s + marker='not slow' 189s + echo amd64 189s + grep -E 'mips|hppa' 189s === python3.13 === 189s + PYTEST_WARN_IGNORE= 189s + cd /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp 189s + TEST_SUCCESS=true 189s + for py in $pys 189s + echo '=== python3.13 ===' 189s ++ python3.13 -c 'import pandas as pd; print(pd.__path__[0])' 190s + modpath=/usr/lib/python3/dist-packages/pandas 190s + for TEST_SUBSET in $modpath/tests/* 190s + echo /usr/lib/python3/dist-packages/pandas/tests/__init__.py 190s + grep -q -e __pycache__ 190s + PANDAS_CI=1 190s + LC_ALL=C.UTF-8 190s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/__init__.py 192s ============================= test session starts ============================== 192s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 192s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 192s rootdir: /usr/lib/python3/dist-packages/pandas 192s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 192s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 192s asyncio: mode=Mode.STRICT 192s collected 0 items 192s 192s =============================== warnings summary =============================== 192s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 192s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-b8pc153r' 192s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 192s 192s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 192s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path
/usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-pduykyjf' 192s session.config.cache.set(STEPWISE_CACHE_DIR, []) 192s 192s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 192s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 192s ============================= 2 warnings in 0.10s ============================== 192s + test 5 == 5 192s + echo 'rdjoqkol test state = true' 192s + for TEST_SUBSET in $modpath/tests/* 192s + echo /usr/lib/python3/dist-packages/pandas/tests/__pycache__ 192s + grep -q -e __pycache__ 192s rdjoqkol test state = true 192s + echo 'rdjoqkol test state = true' 192s rdjoqkol test state = true 192s + for TEST_SUBSET in $modpath/tests/* 192s + echo /usr/lib/python3/dist-packages/pandas/tests/api 192s + grep -q -e __pycache__ 192s + PANDAS_CI=1 192s + LC_ALL=C.UTF-8 192s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/api 193s ============================= test session starts ============================== 193s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 193s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 193s rootdir: /usr/lib/python3/dist-packages/pandas 193s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 193s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 193s asyncio: mode=Mode.STRICT 193s collected 14 items 193s 194s ../../../usr/lib/python3/dist-packages/pandas/tests/api/test_api.py ............ 194s ../../../usr/lib/python3/dist-packages/pandas/tests/api/test_types.py .. 194s 194s =============================== warnings summary =============================== 194s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 194s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-4vmnku21' 194s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 194s 194s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 194s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-y0ngjnmm' 194s session.config.cache.set(STEPWISE_CACHE_DIR, []) 194s 194s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 194s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 194s ============================= slowest 30 durations ============================= 194s 194s (30 durations < 0.005s hidden. Use -vv to show these durations.) 
194s ======================== 14 passed, 2 warnings in 0.10s ======================== 194s + echo 'rdjoqkol test state = true' 194s + for TEST_SUBSET in $modpath/tests/* 194s + echo /usr/lib/python3/dist-packages/pandas/tests/apply 194s + grep -q -e __pycache__ 194s rdjoqkol test state = true 194s + PANDAS_CI=1 194s + LC_ALL=C.UTF-8 194s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/apply 195s ============================= test session starts ============================== 195s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 195s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 195s rootdir: /usr/lib/python3/dist-packages/pandas 195s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 195s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 195s asyncio: mode=Mode.STRICT 195s collected 1243 items 195s 196s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_frame_apply.py .s....ssss........ss.s.s...............ss..ss.s..ss...................ssssssssssssssss.........................s..............................................s.s.s.s.............................................sss.s...........s.s..s...s....... 196s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_frame_apply_relabeling.py ..x.. 197s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_frame_transform.py ...s.s.s................................................ss..ss..ss.....x........x........x........ 197s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_invalid_arg.py ....................................................................................................................................................................................................... 197s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_numba.py sssssssssssssssssss 198s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_series_apply.py ................................x.....x....x........................................................................................ 198s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_series_apply_relabeling.py .. 198s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_series_transform.py ............ 200s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_str.py ....................xxxxx...................................................................................................................................................................................................................................................................................................................................................................................................................................................................x...........x...........x...........x...........x........ 
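The shell trace repeated before each subset above shows how the test wrapper drives pytest: it resolves the installed pandas path, iterates over every entry under $modpath/tests/, skips __pycache__, and runs each subset under xvfb-run with the 'not slow' marker; the `test 5 == 5` check after the first run accepts pytest's exit status 5, which means "no tests collected" (the __init__.py subset collects 0 items). A minimal sketch of that loop, reconstructed from the trace (the exit-status bookkeeping and the surrounding script structure are assumptions, not the packaged script itself):

  modpath=$(python3.13 -c 'import pandas as pd; print(pd.__path__[0])')
  tomlfile=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml
  sourcetestroot=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests
  TEST_SUCCESS=true
  for TEST_SUBSET in "$modpath"/tests/*; do
      # skip byte-code caches, as the traced `grep -q -e __pycache__` does
      echo "$TEST_SUBSET" | grep -q -e __pycache__ && continue
      PANDAS_CI=1 LC_ALL=C.UTF-8 \
      xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' \
          python3.13 -m pytest --tb=long -s -m 'not slow' \
          -c "$tomlfile" --deb-data-root-dir="$sourcetestroot" \
          --rootdir="$modpath" "$TEST_SUBSET"
      rc=$?
      # exit status 5 = "no tests collected"; treated like success here
      # (assumption based on the traced `test 5 == 5`), anything else
      # non-zero marks the whole run as failed
      [ "$rc" -eq 0 ] || [ "$rc" -eq 5 ] || TEST_SUCCESS=false
  done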
200s 200s =============================== warnings summary =============================== 200s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 200s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-8dynibsu' 200s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 200s 200s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 200s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-fullh1jd' 200s session.config.cache.set(STEPWISE_CACHE_DIR, []) 200s 200s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 200s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 200s ============================= slowest 30 durations ============================= 200s 0.05s call tests/apply/test_frame_apply.py::test_apply_differently_indexed 200s 0.03s call tests/apply/test_frame_apply.py::test_agg_transform[axis=1] 200s 0.03s call tests/apply/test_frame_apply.py::test_agg_transform[axis='columns'] 200s 0.02s call tests/apply/test_frame_apply.py::test_agg_reduce[axis='columns'] 200s 0.02s call tests/apply/test_frame_apply.py::test_agg_reduce[axis=1] 200s 0.02s call tests/apply/test_series_apply.py::test_transform[False] 200s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops1-names1] 200s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops1-names1] 200s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops3-names3] 200s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops3-names3] 200s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops2-names2] 200s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops0-names0] 200s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops0-names0] 200s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops2-names2] 200s 0.01s call tests/apply/test_series_apply.py::test_transform[compat] 200s 0.01s call tests/apply/test_str.py::test_transform_groupby_kernel_frame[axis=1-pct_change] 200s 0.01s call tests/apply/test_frame_apply_relabeling.py::test_agg_namedtuple 200s 0.01s call tests/apply/test_str.py::test_transform_groupby_kernel_frame[axis='columns'-pct_change] 200s 0.01s call tests/apply/test_frame_apply.py::test_agg_reduce[axis=0] 200s 0.01s call tests/apply/test_frame_apply.py::test_agg_reduce[axis='index'] 200s 0.01s call tests/apply/test_frame_apply.py::test_agg_transform[axis=0] 200s 0.01s call tests/apply/test_frame_apply.py::test_agg_transform[axis='index'] 200s 0.01s call tests/apply/test_frame_apply_relabeling.py::test_agg_relabel 200s 0.01s call tests/apply/test_frame_apply.py::test_apply_mutating 200s 0.01s call tests/apply/test_series_apply.py::test_with_nested_series[agg] 200s 0.01s call tests/apply/test_frame_apply_relabeling.py::test_agg_relabel_partial_functions 200s 0.01s call tests/apply/test_series_apply.py::test_with_nested_series[apply] 200s 0.01s call 
tests/apply/test_frame_apply_relabeling.py::test_agg_relabel_multi_columns_multi_methods 200s 0.01s call tests/apply/test_frame_apply.py::test_nuiscance_columns 200s 0.01s call tests/apply/test_str.py::test_transform_groupby_kernel_frame[axis=0-pct_change] 200s =========== 1153 passed, 73 skipped, 17 xfailed, 2 warnings in 4.80s =========== 200s rdjoqkol test state = true 200s + echo 'rdjoqkol test state = true' 200s + for TEST_SUBSET in $modpath/tests/* 200s + echo /usr/lib/python3/dist-packages/pandas/tests/arithmetic 200s + grep -q -e __pycache__ 200s + PANDAS_CI=1 200s + LC_ALL=C.UTF-8 200s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/arithmetic 203s ============================= test session starts ============================== 203s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 203s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 203s rootdir: /usr/lib/python3/dist-packages/pandas 203s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 203s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 203s asyncio: mode=Mode.STRICT 203s collected 19330 items 203s 203s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_array_ops.py .. 203s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_categorical.py .. 244s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_datetime64.py ..............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...............................................................................................s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
................................................................................................................................................................................................................................................................................................................................................................................................................ 245s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_interval.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 249s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_numeric.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss...............................................................................................................................................................................
.......................................................................................................................................................................................................................................................................................................................................................................................................................................................s..s.....s..s............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 249s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_object.py .......s........................................................................................... 251s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_period.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
254s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_timedelta64.py .................................................................................................................................................................................................................................................................s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s.....s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s.....s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s.....s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
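The pair of PytestCacheWarning messages repeated in every warnings summary (including the one just below) is harmless: the suite runs against the installed tree under /usr/lib/python3/dist-packages/pandas, which the test user cannot write to, so pytest's cacheprovider and stepwise plugins cannot create .pytest_cache there. If that noise were unwanted, the cache could be disabled or redirected; for illustration only (not what the packaged wrapper does), assuming a writable directory such as the autopkgtest temp dir:

  # disable the cache plugin entirely
  python3.13 -m pytest -p no:cacheprovider -m 'not slow' \
      /usr/lib/python3/dist-packages/pandas/tests/arithmetic
  # or point the cache at a writable location
  python3.13 -m pytest -o cache_dir="$AUTOPKGTEST_TMP/pytest-cache" -m 'not slow' \
      /usr/lib/python3/dist-packages/pandas/tests/arithmetic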
254s 254s =============================== warnings summary =============================== 254s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 254s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-hcimu5_b' 254s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 254s 254s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 254s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-gqy_dcdi' 254s session.config.cache.set(STEPWISE_CACHE_DIR, []) 254s 254s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 254s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 254s ============================= slowest 30 durations ============================= 254s 0.22s call tests/arithmetic/test_datetime64.py::TestDatetimeIndexArithmetic::test_dti_addsub_object_arraylike[pytz.FixedOffset(-300)-Series-Index] 254s 0.12s call tests/arithmetic/test_datetime64.py::TestDatetime64Arithmetic::test_dt64arr_sub_timedeltalike_scalar[datetime.timezone.utc-Timedelta-DataFrame] 254s 0.08s teardown tests/arithmetic/test_timedelta64.py::test_add_timestamp_to_timedelta 254s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-ms] 254s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-s] 254s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-ns] 254s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-us] 254s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-5-True-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-5-True-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-5-True-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-5-False-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-5-False-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-5-False-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ns-5-True-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ns-5-False-cls_and_kwargs27] 254s 0.03s call 
tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-s-5-False-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ms-5-False-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ms-5-True-cls_and_kwargs27] 254s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-s-5-True-cls_and_kwargs27] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ns-5-True-cls_and_kwargs27] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-us-5-True-cls_and_kwargs27] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ns-5-False-cls_and_kwargs27] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-us-5-False-cls_and_kwargs27] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-5-True-CBMonthBegin] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-0-True-CBMonthBegin] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-0-True-CBMonthBegin] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-5-True-CBMonthBegin] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-0-True-CBMonthBegin] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-5-True-CBMonthBegin] 254s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-0-True-CBMonthEnd] 254s =============== 19158 passed, 172 skipped, 2 warnings in 52.39s ================ 255s + echo 'rdjoqkol test state = true' 255s + for TEST_SUBSET in $modpath/tests/* 255s + echo /usr/lib/python3/dist-packages/pandas/tests/arrays 255s + grep -q -e __pycache__ 255s + PANDAS_CI=1 255s rdjoqkol test state = true 255s + LC_ALL=C.UTF-8 255s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/arrays 259s ============================= test session starts ============================== 259s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 259s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 259s rootdir: /usr/lib/python3/dist-packages/pandas 259s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 259s plugins: forked-1.6.0, typeguard-4.4.1, 
localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 259s asyncio: mode=Mode.STRICT 259s collected 19230 items / 2 skipped 259s 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_arithmetic.py ..................... 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_astype.py ... 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_comparison.py .................................... 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_construction.py ............................. 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_function.py ........... 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_indexing.py ... 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_logical.py ................................................................................... 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_ops.py .. 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_reduction.py .............................. 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_repr.py . 259s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_algos.py .............. 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_analytics.py ........x..x................................................ 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_api.py ................................................................. 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_astype.py ...................... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_constructors.py ....................................................................................................................... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_dtypes.py .................................. 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_indexing.py ............................................................................................................................................... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_map.py ............................. 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_missing.py .......................... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_operators.py ...................................... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_replace.py ...................... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_repr.py ....................... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_sorting.py .... 260s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_subclass.py ... 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_take.py ................ 
261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_warnings.py s 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/datetimes/test_constructors.py ......................ssssssssssssss 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/datetimes/test_cumulative.py ... 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/datetimes/test_reductions.py ................................................................................................................................................................................................................. 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_arithmetic.py .............................................................. 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_astype.py ......... 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_comparison.py .................................................................................................................. 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_concat.py ... 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_construction.py ............................... 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_contains.py . 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_function.py .................................................... 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_repr.py ........ 261s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_to_numpy.py .............................. 262s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_arithmetic.py ................................................................................................................................................................................................................................................................................................................................................... 262s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_comparison.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 262s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_concat.py .................. 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_construction.py ............................................... 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_dtypes.py ......................................................................................................................... 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_function.py ............................................................................................................................. 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_indexing.py .. 
263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_reduction.py ............................................... 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_repr.py .......................... 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_astype.py .. 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_formats.py . 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_interval.py .............................................................................. 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_interval_pyarrow.py ssssssss 263s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_overlaps.py .................................................................................................................... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/masked/test_arithmetic.py ..............................................................................................................................................ss........................................................................................................................................................ss........................................................................................................................................................ss........................................................................................................................................................ss........................................................................................................................................................ss............................................................................................................................................................................................................................................................................................................................................................... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/masked/test_function.py ..................... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/masked/test_indexing.py ........................................................................................................... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/numpy_/test_indexing.py ....................................... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/numpy_/test_numpy.py ....................................................................................... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/period/test_astype.py .......... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/period/test_constructors.py ..................... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/period/test_reductions.py ... 265s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_accessor.py ............................................. 
267s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_arithmetics.py ...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 267s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_array.py ........................................................................... 267s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_astype.py ........................ 267s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_combine_concat.py .......... 267s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_constructors.py ................................. 267s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_dtype.py ........................................................ 267s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_indexing.py ................................................................................ 268s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_libsparse.py ..................................................................................... 268s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_reductions.py ....................................................................... 268s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_unary.py ......... 269s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/string_/test_string.py .ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxssxss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.......ssss.ss.ss.ss.ss.ss.ss.ssxxssssxxssss....ssssssss....ssssssss.sssssssssssssssssssssss.ss.ss..ssss.ss...ssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss 269s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/string_/test_string_arrow.py s.ss.ssssssssss.ssssssssssssssssssssssssssssssssss 269s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_array.py ....................................................................................... 
284s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_datetimelike.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
.......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssssss....................................................................................................................................................................................................ssssssssssssssssssssssssssssssssssss..............................................................................................................................................................................................................................................................................................................................
............ 288s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_datetimes.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 288s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_ndarray_backed.py ..... 288s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_period.py ................... 288s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_timedeltas.py ...................................................................................................................................................... 288s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/timedeltas/test_constructors.py ........ 288s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/timedeltas/test_cumulative.py ..... 288s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/timedeltas/test_reductions.py .......................... 
288s 288s =============================== warnings summary =============================== 288s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 288s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-wpo34c1l' 288s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 288s 288s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 288s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-8nbsbqp3' 288s session.config.cache.set(STEPWISE_CACHE_DIR, []) 288s 288s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 288s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 288s ============================= slowest 30 durations ============================= 288s 0.30s call tests/arrays/test_datetimes.py::TestNonNano::test_add_timedeltalike_scalar_mismatched_reso[ms-datetime.timezone(datetime.timedelta(seconds=3600))-scalar3] 288s 0.20s setup tests/arrays/test_datetimelike.py::TestDatetimeArray::test_iter_2d[zoneinfo.ZoneInfo(key='US/Pacific')-B] 288s 0.15s call tests/arrays/sparse/test_libsparse.py::TestSparseIndexUnion::test_index_make_union[xloc5-xlen5-yloc5-ylen5-eloc5-elen5] 288s 0.10s teardown tests/arrays/timedeltas/test_reductions.py::TestReductions::test_mean_2d 288s 0.05s call tests/arrays/sparse/test_accessor.py::TestSeriesAccessor::test_from_coo 288s 0.03s call tests/arrays/categorical/test_dtypes.py::TestCategoricalDtypes::test_codes_dtypes 288s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_mixed_array_comparison[block] 288s 0.02s call tests/arrays/integer/test_arithmetic.py::test_values_multiplying_large_series_by_NA 288s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_mixed_array_comparison[integer] 288s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_scalar_comparison[block] 288s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_int_array_comparison[integer] 288s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_array_comparison[integer] 288s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_scalar_comparison[integer] 288s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_int_array_comparison[block] 288s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_array_comparison[block] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-labels1-csc] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-labels1-csr] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-None-coo] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-labels1-coo] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-None-csc] 288s 0.01s call 
tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-labels1-coo] 288s 0.01s call tests/arrays/test_datetimelike.py::TestPeriodArray::test_median[D] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-None-csr] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-labels1-csr] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-labels1-csc] 288s 0.01s call tests/arrays/test_period.py::test_repr_large 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-None-coo] 288s 0.01s call tests/arrays/test_datetimelike.py::TestPeriodArray::test_median[YE] 288s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-None-csr] 288s 0.01s call tests/arrays/test_datetimelike.py::TestPeriodArray::test_median[B] 288s ========= 18203 passed, 1021 skipped, 8 xfailed, 2 warnings in 31.45s ========== 290s + echo 'rdjoqkol test state = true' 290s rdjoqkol test state = true 290s + for TEST_SUBSET in $modpath/tests/* 290s + echo /usr/lib/python3/dist-packages/pandas/tests/base 290s + grep -q -e __pycache__ 290s + PANDAS_CI=1 290s + LC_ALL=C.UTF-8 290s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/base 292s ============================= test session starts ============================== 292s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 292s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 292s rootdir: /usr/lib/python3/dist-packages/pandas 292s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 292s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 292s asyncio: mode=Mode.STRICT 292s collected 1775 items 292s 292s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_constructors.py ....................... 292s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_conversion.py ................................................................................................................................................................................................................................................................................................................................... 292s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_fillna.py ..................................................................................ssssssssssssssssss....ssss........ssssssssss......ss..............................................ss......................ssssssssssss 293s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_misc.py .......................................................................................................................................................................................................................................xx...xxx....................................................................s...... 
293s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_transpose.py .............................................................................................................................................................................................................................. 293s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_unique.py ..................................................................................ssssssssssssssssss....ssss........ssssssssss......ss..............................................ss......................ssssssssssss..................................................................................ssssssssssssssssss....ssss..........ssssssss......ss......................................................................ssssssssssss.... 294s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_value_counts.py ..................................................................................ssssssssssssssssss....ssss........ssssssssss......ss..............................................ss......................ssssssssssss......................... 294s 294s =============================== warnings summary =============================== 294s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 294s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-3kcl9ojo' 294s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 294s 294s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 294s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-_yqp7ui5' 294s session.config.cache.set(STEPWISE_CACHE_DIR, []) 294s 294s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 294s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 294s ============================= slowest 30 durations ============================= 294s 0.04s setup tests/base/test_misc.py::test_searchsorted[uint64] 294s 0.02s call tests/base/test_value_counts.py::test_value_counts_null[interval-None] 294s 0.02s call tests/base/test_value_counts.py::test_value_counts_null[interval-nan] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts[interval] 294s 0.01s call tests/base/test_unique.py::test_unique_null[period-None] 294s 0.01s call tests/base/test_unique.py::test_unique_null[period-nan] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-tz-nan] 294s 0.01s call tests/base/test_unique.py::test_unique[period] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-tz-None] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_bins[index] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[period-nan] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[period-None] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts[datetime-tz] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_bins[series] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts[period] 294s 0.01s call 
tests/base/test_unique.py::test_unique[datetime-tz] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[timedelta-nan] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[timedelta-None] 294s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-tz-None] 294s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-tz-nan] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts[timedelta] 294s 0.01s call tests/base/test_unique.py::test_unique_null[timedelta-None] 294s 0.01s call tests/base/test_unique.py::test_unique[timedelta] 294s 0.01s call tests/base/test_unique.py::test_unique_null[timedelta-nan] 294s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-nan] 294s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-None] 294s 0.01s teardown tests/base/test_value_counts.py::test_value_counts_object_inference_deprecated 294s 0.01s call tests/base/test_unique.py::test_unique[datetime] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-nan] 294s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-None] 294s =========== 1581 passed, 189 skipped, 5 xfailed, 2 warnings in 2.90s =========== 294s + echo 'rdjoqkol test state = true' 294s + for TEST_SUBSET in $modpath/tests/* 294s + echo /usr/lib/python3/dist-packages/pandas/tests/computation 294s + grep -q -e __pycache__ 294s + PANDAS_CI=1 294s + LC_ALL=C.UTF-8 294s rdjoqkol test state = true 294s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/computation 297s ============================= test session starts ============================== 297s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 297s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 297s rootdir: /usr/lib/python3/dist-packages/pandas 297s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 297s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 297s asyncio: mode=Mode.STRICT 297s collected 11159 items 297s 297s ../../../usr/lib/python3/dist-packages/pandas/tests/computation/test_compat.py ..... 
338s ../../../usr/lib/python3/dist-packages/pandas/tests/computation/test_eval.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xx..............................xx..............................xx..............................xx....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxxxxxxxx..................................................xxxxxxxxxx......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................X.........X.........................................................................................................................................................................................................................................................................................................................................................................................
...........................................................................................................................................................................................................................................................................................................................................x....................................................................................................................................................................................................................................xx..xx..... 338s 338s =============================== warnings summary =============================== 338s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 338s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-nw76meiu' 338s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 338s 338s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 338s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-226ft9ox' 338s session.config.cache.set(STEPWISE_CACHE_DIR, []) 338s 338s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 338s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 338s ============================= slowest 30 durations ============================= 338s 0.17s call tests/computation/test_eval.py::TestEval::test_floor_division[SeriesNaN-Series-python-python] 338s 0.08s call tests/computation/test_eval.py::TestEval::test_complex_cmp_ops[SeriesNaN-Series-python-pandas-|-gt-ne] 338s 0.05s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[numexpr-python] 338s 0.05s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[numexpr-pandas] 338s 0.04s teardown tests/computation/test_eval.py::TestValidate::test_validate_bool_args[5.0] 338s 0.03s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[python-python] 338s 0.03s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[python-pandas] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_simple_arith_ops[numexpr-python] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[numexpr-python] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_lhs_expression_subscript 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_attr_expression 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_nested_period_index_subscript_expression 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[numexpr-pandas] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-pandas-&] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[python-pandas] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-pandas-|] 338s 0.01s call 
tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-python-|] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[python-python] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_simple_arith_ops[numexpr-pandas] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-pandas-|] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-python-&] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-pandas-&] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_multi_line_expression 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-python-&] 338s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-python-|] 338s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-pandas-dt-dt-i-i] 338s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-python-dt-dt-i-i] 338s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-pandas-dt-dt-s-i] 338s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[numexpr-pandas-dt-dt-s-i] 338s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-python-dt-dt-s-i] 338s ========== 11124 passed, 33 xfailed, 2 xpassed, 2 warnings in 41.87s =========== 339s + echo 'rdjoqkol test state = true' 339s rdjoqkol test state = true 339s + for TEST_SUBSET in $modpath/tests/* 339s + echo /usr/lib/python3/dist-packages/pandas/tests/config 339s + grep -q -e __pycache__ 339s + PANDAS_CI=1 339s + LC_ALL=C.UTF-8 339s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/config 340s ============================= test session starts ============================== 340s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 340s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 340s rootdir: /usr/lib/python3/dist-packages/pandas 340s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 340s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 340s asyncio: mode=Mode.STRICT 340s collected 50 items 340s 341s ../../../usr/lib/python3/dist-packages/pandas/tests/config/test_config.py ..................... 341s ../../../usr/lib/python3/dist-packages/pandas/tests/config/test_localization.py ............................. 
341s 341s =============================== warnings summary =============================== 341s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 341s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-pnufu34_' 341s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 341s 341s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 341s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-1u6wft3h' 341s session.config.cache.set(STEPWISE_CACHE_DIR, []) 341s 341s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 341s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 341s ============================= slowest 30 durations ============================= 341s 0.01s call tests/config/test_localization.py::test_get_locales_prefix 341s 341s (29 durations < 0.005s hidden. Use -vv to show these durations.) 341s ======================== 50 passed, 2 warnings in 0.47s ======================== 341s + echo 'rdjoqkol test state = true' 341s + for TEST_SUBSET in $modpath/tests/* 341s rdjoqkol test state = true 341s + echo /usr/lib/python3/dist-packages/pandas/tests/construction 341s + grep -q -e __pycache__ 341s + PANDAS_CI=1 341s + LC_ALL=C.UTF-8 341s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/construction 342s ============================= test session starts ============================== 342s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 342s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 342s rootdir: /usr/lib/python3/dist-packages/pandas 342s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 342s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 342s asyncio: mode=Mode.STRICT 342s collected 1 item 342s 342s ../../../usr/lib/python3/dist-packages/pandas/tests/construction/test_extract_array.py . 
342s 342s =============================== warnings summary =============================== 342s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 342s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-5odn82nj' 342s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 342s 342s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 342s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-h6qpmnx0' 342s session.config.cache.set(STEPWISE_CACHE_DIR, []) 342s 342s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 342s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 342s ============================= slowest 30 durations ============================= 342s 342s (3 durations < 0.005s hidden. Use -vv to show these durations.) 342s ======================== 1 passed, 2 warnings in 0.08s ========================= 342s + echo 'rdjoqkol test state = true' 342s + for TEST_SUBSET in $modpath/tests/* 342s + echo /usr/lib/python3/dist-packages/pandas/tests/copy_view 342s rdjoqkol test state = true 342s + grep -q -e __pycache__ 342s + PANDAS_CI=1 342s + LC_ALL=C.UTF-8 342s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/copy_view 345s ============================= test session starts ============================== 345s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 345s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 345s rootdir: /usr/lib/python3/dist-packages/pandas 345s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 345s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 345s asyncio: mode=Mode.STRICT 345s collected 793 items 345s 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_datetimeindex.py ...... 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_index.py ..................... 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_periodindex.py .. 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_timedeltaindex.py .. 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_array.py ............. 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_astype.py .....ss...s..........s.. 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_chained_assignment_deprecation.py ............... 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_clip.py ...... 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_constructors.py ............................................................................ 345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_core_functionalities.py ....... 
345s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_functions.py .................... 346s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_indexing.py ....................................................................................s.....s........................................................................................................................................ 346s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_internals.py ..................... 346s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_interp_fillna.py ...................................................... 346s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_methods.py ........................................................................................................................................................................................................................................................ 346s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_replace.py ........................................ 346s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_setitem.py ......... 346s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_util.py .. 346s 346s =============================== warnings summary =============================== 346s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 346s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-kpqw1xza' 346s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 346s 346s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 346s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-sp14k360' 346s session.config.cache.set(STEPWISE_CACHE_DIR, []) 346s 346s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 346s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 346s ============================= slowest 30 durations ============================= 346s 0.01s call tests/copy_view/test_internals.py::test_exponential_backoff 346s 346s (29 durations < 0.005s hidden. Use -vv to show these durations.) 
346s ================== 787 passed, 6 skipped, 2 warnings in 2.20s ================== 346s + echo 'rdjoqkol test state = true' 346s + for TEST_SUBSET in $modpath/tests/* 346s rdjoqkol test state = true 346s + echo /usr/lib/python3/dist-packages/pandas/tests/dtypes 346s + grep -q -e __pycache__ 346s + PANDAS_CI=1 346s + LC_ALL=C.UTF-8 346s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/dtypes 349s ============================= test session starts ============================== 349s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 349s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 349s rootdir: /usr/lib/python3/dist-packages/pandas 349s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 349s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 349s asyncio: mode=Mode.STRICT 349s collected 5628 items 349s 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_can_hold_element.py ........... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_construct_from_scalar.py .... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_construct_ndarray.py ....... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_construct_object_arr.py ....................................... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_dict_compat.py . 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_downcast.py ................................... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_find_common_type.py .......................................................................................... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_infer_datetimelike.py ... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_infer_dtype.py .................................................................... 349s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_maybe_box_native.py ................ 
352s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_promote.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 
353s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_common.py .............................................................................................................................................................................s...................................................................................................................................................................................................................................................................... 353s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_concat.py .... 353s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_dtypes.py ........................................................................................................................................................................................................................................................................................... 353s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_generic.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 354s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_inference.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssssssss....................................................... 355s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_missing.py ..........................................................................................................xxxx............................................................................................................................................................................................................. 
355s 355s =============================== warnings summary =============================== 355s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 355s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-hw53c56h' 355s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 355s 355s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 355s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-fpmxlprw' 355s session.config.cache.set(STEPWISE_CACHE_DIR, []) 355s 355s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 355s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 355s ============================= slowest 30 durations ============================= 355s 0.07s call tests/dtypes/test_inference.py::TestInference::test_maybe_convert_objects_mixed_datetimes 355s 0.04s call tests/dtypes/test_common.py::test_is_sparse[True] 355s 0.02s teardown tests/dtypes/test_missing.py::TestIsValidNAForDtype::test_is_valid_na_for_dtype_categorical 355s 355s (27 durations < 0.005s hidden. Use -vv to show these durations.) 355s ============ 5615 passed, 9 skipped, 4 xfailed, 2 warnings in 6.65s ============ 355s + echo 'rdjoqkol test state = true' 355s + for TEST_SUBSET in $modpath/tests/* 355s rdjoqkol test state = true 355s + echo /usr/lib/python3/dist-packages/pandas/tests/extension 355s + grep -q -e __pycache__ 355s + PANDAS_CI=1 355s + LC_ALL=C.UTF-8 355s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/extension 359s ============================= test session starts ============================== 359s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 359s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 359s rootdir: /usr/lib/python3/dist-packages/pandas 359s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 359s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 359s asyncio: mode=Mode.STRICT 359s collected 16808 items / 1 skipped 359s 359s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/array_with_attr/test_array_with_attr.py . 362s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/decimal/test_decimal.py ssssssssssssssssssssssssssssssssssss....................................x.................................................................................................................................................................................................................................................................................................................................xx............................................................s............................xxxxxxxxss..............xxssxxss................................................xxx..................... 
373s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/json/test_json.py ssssssssssssssssssssssssssssssssssss.................................................................................ssssssssssssssssssssssss.............................................s.......................................................s............................................ss................................................................................xx.......................................................s............xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx........xxxx........xxxxxxs.xxxxxxxxxxxxxxxxxxxxx....xxxxxxx.xx.xxxxxx...xxxxx...x...xxxxxxxxxxxx. 373s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/list/test_list.py . 375s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_categorical.py ssssssssssssssssssssssssssssssssssss....................................x..................................................................................................ssssssssssssssssssssssss...................................s.........................................................................................................................ss.................................................................................................xx...........................................................s............x..sxx................x.............xxxxxssssssssssssssssssssssssssssssssssss.. 375s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_common.py ............ 376s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_datetime.py ........................ssss............................................x..............................................................................................ssssssssssssssssssssssss.................................................................s......................................................................................................................................................................................xx..............................................xx...........................................................s............................................ssss......... 376s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_extension.py .............. 378s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_interval.py ssssssssssssssssssssssssssssssssssss....................................x..................................................................................................ssssssssssssssssssssssss.................................................................s...............................................................................................................................ss......................................................xx..............................................xx............................................................s............x. 
393s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_masked.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss..................................................................................................................................................................................................................................................................................................................................................................................................x...x...x...x...x...x...x...x...x...x...x..............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssssssssssssssssssss..ssssssssssssssssssss..........................................................................................................................................................................................................................................................................ssssssssssssssssssssss................................................................................................................................................................................ssssssssssssssssssssss......................ssssssssssssssssssssss................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxxxxxxxxxxxxxxxxxxxx..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s.s.s.s.s.s.s.s.s.s.s..................................................................................s............................................................................................x...........sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 398s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_numpy.py 
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxx...............................................................................................................s.s........................ss.x.x...x.x.x.x.x.xxx.x.x.............x.x.....................................................x.x................ssssssssssssssssssssssssssssssssssssssssssssssss.x.x.x.x.x..xx.x..xx..xx...xxx...xxx.x...x...x.x.x..............xx..xxssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.. 400s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_period.py ..........................................ssss........ssss.....................................................................................x...x..................................................................................................................................................................................................ssssssssssssssssssssssssssssssssssssssssssssssss..................................................................................................................................ss........................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxx.......................................................................................................................s.s............................................................................ssss........ssss................ 
410s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_sparse.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss..ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....................................ssss................................................................................................................................................................................................................................................................................................................................................................................................xxxx........................ssss..............................................................xxxx..........................................................................................................................................xxxxxxxxxxxxxx...x.x.x.xxxxxxxxxxxxxxxss........xxxxxxxxssxxssss.x.x.x.xxxxxxxxxssxxss..........s.s.s.s.s.s...s......xxx......xx........ss..ssssss................xsxsssssssssssssss..............ssssssssssssss..............xxxx....xx.x............................x..x.x......xx.xxx......xxxxxx. 414s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_string.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss...ssssss...ssssss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss...ssssss...ssssss.x..ssssssss.x..ssssssss....ssssssss....ssssssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss....ssssssss....ssssssss........ssssssssssssssss........ssssssssssssssss.ss.ss.ss.ss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssss.ss.ss.......................................ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.......................................ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss........ssssssssssssssss........ssssssssssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss....ssssssss....ssssssss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss..ssss..ssss........ssssssssssssssss........ssssssssssssssss.ss.ss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss...ssssss...ssssss....ssssssss....ssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.
.ssss..ssss.ss.ss......ssssssssssss......ssssssssssss.ss.ssssssssssssss.....ssssssssss.....ssssssssss.....ssssssssss.....ssssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss...ssssss...ssssss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss..ssss..ssssxxssssxxssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss..ssss..ssss.ss.ss.ss.ss.sssss.sssss.ss.ss.ss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ss.ss.ss.ss 414s 414s =============================== warnings summary =============================== 414s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 414s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xymcqryj' 414s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 414s 414s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 414s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ptp2tagd' 414s session.config.cache.set(STEPWISE_CACHE_DIR, []) 414s 414s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 414s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 414s ============================= slowest 30 durations ============================= 414s 0.20s setup tests/extension/test_numpy.py::TestNumpyExtensionArray::test_reductions_2d_axis_none[object-std] 414s 0.18s call tests/extension/test_masked.py::TestMaskedArrays::test_EA_types[Float32Dtype-c] 414s 0.12s setup tests/extension/test_masked.py::TestMaskedArrays::test_reshape[Int16Dtype] 414s 0.09s call tests/extension/json/test_json.py::TestJSONArray::test_hash_pandas_object_works[True] 414s 0.06s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[0-series-index2] 414s 0.06s teardown tests/extension/test_string.py::test_searchsorted_with_na_raises[False-False-pyarrow_numpy] 414s 0.05s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[nan-series-index2] 414s 0.05s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[0-series-index3] 414s 0.05s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[nan-series-index3] 414s 0.04s call tests/extension/decimal/test_decimal.py::TestDecimalArray::test_arith_series_with_array[__rpow__] 414s 
0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[series-index2] 414s 0.04s call tests/extension/test_string.py::TestStringArray::test_unstack[False-python-series-index2] 414s 0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[frame-index2] 414s 0.04s call tests/extension/test_string.py::TestStringArray::test_unstack[True-python-series-index2] 414s 0.04s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[nan-frame-index2] 414s 0.04s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[0-frame-index2] 414s 0.04s call tests/extension/decimal/test_decimal.py::TestDecimalArray::test_unstack[series-index2] 414s 0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[series-index3] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Float32Dtype-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[UInt64Dtype-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Float64Dtype-series-index2] 414s 0.04s call tests/extension/test_numpy.py::TestNumpyExtensionArray::test_unstack[float-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[UInt32Dtype-series-index2] 414s 0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[frame-index3] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Int64Dtype-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Int8Dtype-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[UInt8Dtype-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[BooleanDtype-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Int16Dtype-series-index2] 414s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[UInt16Dtype-series-index2] 414s ======== 12106 passed, 4371 skipped, 332 xfailed, 2 warnings in 57.33s ========= 415s + echo 'rdjoqkol test state = true' 415s + for TEST_SUBSET in $modpath/tests/* 415s rdjoqkol test state = true 415s + echo /usr/lib/python3/dist-packages/pandas/tests/frame 415s + grep -q -e __pycache__ 415s + PANDAS_CI=1 415s + LC_ALL=C.UTF-8 415s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/frame 420s ============================= test session starts ============================== 420s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 420s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 420s rootdir: /usr/lib/python3/dist-packages/pandas 420s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 420s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 420s asyncio: mode=Mode.STRICT 420s collected 11172 items / 433 deselected / 1 skipped / 10739 selected 420s 420s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/constructors/test_from_dict.py .............. 420s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/constructors/test_from_records.py ........................... 
420s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_coercion.py .......x.x. 420s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_delitem.py .... 420s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_get.py .... 420s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_get_value.py .. 420s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_getitem.py ........................................ 422s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_indexing.py ...................................................................................................................................................................................................................................sss................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 422s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_insert.py ....... 422s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_mask.py ........... 422s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_set_value.py ... 423s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_setitem.py ........................................................................................................s..........................................................................................xxx...........................x..x..x..x........ 423s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_take.py .... 425s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_where.py ............................................................................................................................................. 425s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_xs.py .............................. 425s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_add_prefix_suffix.py ... 426s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_align.py ........................................................................................................................................................................................................................................................................................................................................................................................................................ 426s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_asfreq.py ........................................ 426s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_asof.py ........... 426s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_assign.py ..... 
426s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_astype.py .......................................................................................................................s....s........................................................................................................ss.....ssssssssss 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_at_time.py ...................... 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_between_time.py ss............................ 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_clip.py ..................... 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_combine.py ..... 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_combine_first.py ..................................s............................... 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_compare.py ......................... 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_convert_dtypes.py ..ssss..ssss.sss.. 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_copy.py ..... 427s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_count.py .. 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_cov_corr.py .............................................................s............ 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_describe.py ...............................................s 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_diff.py .............................................. 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_dot.py ................sss 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_drop.py ......................................................................... 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_drop_duplicates.py ..................................... 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_droplevel.py .. 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_dropna.py ................... 428s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_dtypes.py ........ 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_duplicated.py .......xxx........... 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_equals.py ... 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_explode.py ..................... 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_fillna.py ................................................................. 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_filter.py ........... 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_first_and_last.py ............. 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_first_valid_index.py ............... 429s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_get_numeric_data.py .... 430s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_head_tail.py .................................................................. 
430s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_infer_objects.py . 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_info.py ..........................x......s...... 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_interpolate.py ...................................................................ssssssssss 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_is_homogeneous_dtype.py ....... 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_isetitem.py ... 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_isin.py ................. 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_iterrows.py . 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_join.py ...........s................... 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_map.py ......................... 431s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_matmul.py .. 432s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_nlargest.py .........................................................................................................................................................................................................................................................X.....X.....X.....X.......X..... 432s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_pct_change.py ............................. 432s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_pipe.py ...... 432s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_pop.py ... 434s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_quantile.py ........................................................................xx..........xx..........xx..........xx.................. 435s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_rank.py .........................................................................................................................ss 436s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reindex.py ................................................................................................................................................. 436s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reindex_like.py ..... 436s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_rename.py ......................... 436s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_rename_axis.py ......... 436s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reorder_levels.py ... 436s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_replace.py ...................ss.................ssss....ssss................................................................................................................................................................. 437s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reset_index.py ................................................................................................................................ 437s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_round.py ......... 
437s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_sample.py .......................................................... 437s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_select_dtypes.py ..................................s...... 437s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_set_axis.py .............. 439s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_set_index.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 441s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_shift.py ...................................................................................x.x.x.x.x.xxxx........ 441s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_size.py ..... 442s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_sort_index.py ................................................................. 442s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_sort_values.py ...................................................X...............................XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX..... 442s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_swapaxes.py .... 442s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_swaplevel.py . 443s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_csv.py ............................................................................. 443s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_dict.py ...................................................................................................... 443s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_dict_of_blocks.py ... 443s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_numpy.py .... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_period.py ...................................................................... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_records.py ................................... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_timestamp.py ...................................................................... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_transpose.py ................... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_truncate.py ........................................................................................ 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_tz_convert.py ........... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_tz_localize.py ......... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_update.py .............. 
444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_value_counts.py ................................. 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_values.py ............... 444s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_alter_axes.py .. 445s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_api.py ......................ss.......... 447s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_arithmetic.py ..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x..........x................................................................................................................................ 447s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_block_internals.py ................... 449s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_constructors.py ..................................................................................................................................................................................................s.....................................................................................................................................................................................................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..........sss...ss.......................................................s..........................................xxxx..xx..........xxxx..xx................................ 449s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_cumulative.py ....... 449s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_iteration.py .................... 449s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_logical_ops.py ................. 449s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_nonunique_indexes.py ................ 449s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_npfuncs.py .... 451s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_query_eval.py ..................ss..ss..ss.......................s.......................s..s......sssss.................................................s........................s..s.....sssss...............................................ss..ss......................ss.............................................................................s....ss. 
454s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_reductions.py ....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s....................................................................................................x.............x............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 454s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_repr.py ..........................................ssss................ 460s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_stack_unstack.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 460s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_subclass.py .................................................... 461s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_ufunc.py ....xx.........xxxxxxxx.xx....s. 461s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_unary.py .................. 462s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_validate.py ............................ 
462s 462s =============================== warnings summary =============================== 462s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 462s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-hysqkrut' 462s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 462s 462s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 462s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-pt0zxv0n' 462s session.config.cache.set(STEPWISE_CACHE_DIR, []) 462s 462s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 462s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 462s ============================= slowest 30 durations ============================= 462s 1.61s call tests/frame/indexing/test_where.py::test_where_inplace_casting 462s 1.15s call tests/frame/methods/test_rank.py::TestRank::test_pct_max_many_rows 462s 0.70s call tests/frame/test_api.py::TestDataFrameMisc::test_inspect_getmembers 462s 0.37s call tests/frame/methods/test_info.py::test_info_verbose_with_counts_spacing[10001- # Column Non-Null Count Dtype ---- ------ -------------- ----- - 0 0 3 non-null float64- 10000 10000 3 non-null float64] 462s 0.35s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack[True] 462s 0.25s call tests/frame/methods/test_cov_corr.py::TestDataFrameCorr::test_corr_scipy_method[kendall] 462s 0.22s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_with_dst_transitions_with_pickle 462s 0.21s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_chunking[10000] 462s 0.20s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_chunking[100000] 462s 0.20s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_chunking[50000] 462s 0.16s call tests/frame/methods/test_cov_corr.py::TestDataFrameCorrWith::test_corrwith[Float64] 462s 0.15s call tests/frame/methods/test_sample.py::TestSample::test_sample_generator[DataFrame] 462s 0.13s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack[False] 462s 0.11s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_order_with_unsorted_levels_multi_row[False] 462s 0.10s call tests/frame/methods/test_at_time.py::TestAtTime::test_at_time_axis[index] 462s 0.10s call tests/frame/methods/test_at_time.py::TestAtTime::test_at_time_axis[0] 462s 0.09s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_unstack_multiple[False] 462s 0.09s call tests/frame/test_repr.py::TestDataFrameRepr::test_repr_to_string 462s 0.08s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_unstack_multiple[True] 462s 0.08s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_dups_cols 462s 0.08s call tests/frame/test_repr.py::TestDataFrameRepr::test_repr_bytes_61_lines 462s 0.08s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_order_with_unsorted_levels_multi_row[True] 462s 0.07s call 
tests/frame/methods/test_interpolate.py::TestDataFrameInterpolate::test_interp_string_axis[index-0] 462s 0.07s call tests/frame/test_block_internals.py::TestDataFrameBlockInternals::test_strange_column_corruption_issue 462s 0.06s call tests/frame/methods/test_interpolate.py::TestDataFrameInterpolate::test_interp_string_axis[columns-1] 462s 0.05s call tests/frame/test_stack_unstack.py::TestDataFrameReshape::test_stack_int_level_names[False] 462s 0.05s call tests/frame/test_query_eval.py::TestDataFrameQueryNumExprPandas::test_nested_scope 462s 0.05s call tests/frame/methods/test_duplicated.py::test_duplicated_implemented_no_recursion 462s 0.05s call tests/frame/test_iteration.py::TestIteration::test_itertuples_py2_3_field_limit_namedtuple[False-1024] 462s 0.05s call tests/frame/methods/test_head_tail.py::test_head_tail_generic[timedelta-DataFrame] 462s = 10435 passed, 209 skipped, 433 deselected, 58 xfailed, 38 xpassed, 2 warnings in 44.83s = 463s rdjoqkol test state = true 463s + echo 'rdjoqkol test state = true' 463s + for TEST_SUBSET in $modpath/tests/* 463s + echo /usr/lib/python3/dist-packages/pandas/tests/generic 463s + grep -q -e __pycache__ 463s + PANDAS_CI=1 463s + LC_ALL=C.UTF-8 463s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/generic 465s ============================= test session starts ============================== 465s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 465s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 465s rootdir: /usr/lib/python3/dist-packages/pandas 465s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 465s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 465s asyncio: mode=Mode.STRICT 465s collected 1249 items 465s 465s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_duplicate_labels.py ..........xx...........x.......xx.xxx................x................ 468s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_finalize.py ..........................x..................................x........x....................................................................................................................xs..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x........................................................................................xxxxxxxxx..........xxxxxxxxxxxx. 468s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_frame.py ............... 468s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_generic.py ................................................................................. 
468s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_label_or_level_utils.py ....................................................................... 468s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_series.py ................... 469s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_to_xarray.py ......................s....................................... 469s 469s =============================== warnings summary =============================== 469s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 469s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-29nd5a7k' 469s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 469s 469s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 469s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-aua4fyh1' 469s session.config.cache.set(STEPWISE_CACHE_DIR, []) 469s 469s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 469s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 469s ============================= slowest 30 durations ============================= 469s 0.17s call tests/generic/test_to_xarray.py::TestSeriesToXArray::test_to_xarray_index_types[string] 469s 0.02s call tests/generic/test_generic.py::TestGeneric::test_truncate_out_of_bounds[DataFrame] 469s 0.01s call tests/generic/test_generic.py::TestGeneric::test_truncate_out_of_bounds[Series] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[IntervalIndex] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[MultiIndex] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[CategoricalIndex] 469s 0.01s call tests/generic/test_finalize.py::test_finalize_called[pivot', columns='A] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[TimedeltaIndex] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[Index2] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[DatetimeIndex] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[PeriodIndex] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[Index1] 469s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[Index0] 469s 0.01s call tests/generic/test_to_xarray.py::TestDataFrameToXArray::test_to_xarray_with_multiindex 469s 0.01s call tests/generic/test_finalize.py::test_groupby_finalize_not_implemented[5-obj1] 469s 0.01s call tests/generic/test_to_xarray.py::TestDataFrameToXArray::test_to_xarray_index_types[string] 469s 469s (14 durations < 0.005s hidden. Use -vv to show these durations.) 
469s ========== 1006 passed, 105 skipped, 138 xfailed, 2 warnings in 4.34s ========== 469s + echo 'rdjoqkol test state = true' 469s rdjoqkol test state = true 469s + for TEST_SUBSET in $modpath/tests/* 469s + echo /usr/lib/python3/dist-packages/pandas/tests/groupby 469s + grep -q -e __pycache__ 469s + PANDAS_CI=1 469s + LC_ALL=C.UTF-8 469s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/groupby 473s ============================= test session starts ============================== 473s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 473s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 473s rootdir: /usr/lib/python3/dist-packages/pandas 473s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 473s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 473s asyncio: mode=Mode.STRICT 473s collected 29434 items / 1832 deselected / 1 skipped / 27602 selected 473s 474s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_aggregate.py ...............................................................................................................................................................................................................................................................................................................................................x..x.......................................................................x....... 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_cython.py ........................................................................................................................................................................ 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_other.py ........................................ 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_corrwith.py . 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_describe.py ......................... 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_groupby_shift_diff.py ............................................... 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_is_monotonic.py ...... 475s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_nlargest_nsmallest.py ........................................... 476s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_nth.py .................................................................................................................................................................................................................................... 
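Each session header reports a 'deselected' count (for the groupby subset below, '29434 items / 1832 deselected'); those are tests filtered out up front by the -m 'not slow' marker expression in the invocation, not tests skipped at runtime. A minimal, self-contained illustration with hypothetical tests, not taken from the pandas suite (pandas registers its own slow marker in the pyproject.toml referenced above):

    import pytest

    # In a real suite the marker would be registered under
    # [tool.pytest.ini_options] markers to avoid an unknown-mark warning.
    @pytest.mark.slow            # with `pytest -m "not slow"` this test is deselected
    def test_long_running():
        assert sum(range(10_000_000)) >= 0

    def test_quick():            # still collected and run
        assert 1 + 1 == 2

Running pytest -m "not slow" against such a file reports '1 passed, 1 deselected', which is what the session headers here show at larger scale.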
477s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_quantile.py ..................................................................................................x....x.......................................................................................................................................................................................................................... 479s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_rank.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 479s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_sample.py .............. 481s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_size.py .......x....x....x....x....x....x....x....x...............ss 481s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_skew.py . 482s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_value_counts.py ...........X......XXX...XXX.........XXX...XXX........................................ss.ss.ss.ss.ss.ss....XX..........................XXXXXXXXXXXXXXXX........XXXXXXXX...................................... 483s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_all_methods.py ......................................xx..............................................................................................ss..........ss..............ss...... 483s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_api.py ......s..s..............................s..s.......................... 483s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_apply.py .................................................................................................................................... 483s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_apply_mutate.py ..... 483s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_bin_groupby.py ...... 
494s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_categorical.py .................................................................................................................................x......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s........s........s........s........s........s........s........s...ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss...................ss.ss.ss.ss.ss.ss.................sxsx................................................................................sxsx................................................................................sxsx................................................................................sxsx.......................................x................................ 494s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_counting.py .................................ssss 494s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_cumulative.py ..................................................... 495s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_filters.py ............................ 
501s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_groupby.py ......................................s..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s...................s................................................................................................................... 
518s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_groupby_dropna.py ..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss..............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxxxxxx.................................................. 
519s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_groupby_subclass.py .....s................................................................. 519s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_grouping.py ....................................................................................... 519s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_index_as_string.py .................. 519s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_indexing.py ............................................................................................................................................................................ 519s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_libgroupby.py ........................... 519s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_missing.py ......................... 521s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_numeric_only.py ..................xxx......ssssss..............................xxx.................................................................................................................................................................................................................................................................................................................................................................................. 521s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_pipe.py .. 542s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_raises.py .....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
.......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 543s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_reductions.py ...............................................................................................................................................................................................................................................................................................................ss........................................................................................................................................................ 543s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py ..............................s 543s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 552s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_transform.py ....................x.........................................................................................................................x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x....................................................................................x....................x............................................................................................................................................................................................................ 552s 552s =============================== warnings summary =============================== 552s tests/groupby/test_categorical.py::test_basic 552s /usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:86: FutureWarning: The behavior of DataFrame.sum with axis=None is deprecated, in a future version this will reduce over both axes and return a scalar. 
To retain the old behavior, pass axis=0 (or do not pass axis)
552s return reduction(axis=axis, out=out, **passkwargs)
552s 
552s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475
552s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-hg2zvgqw'
552s config.cache.set("cache/nodeids", sorted(self.cached_nodeids))
552s 
552s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51
552s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-_wyxtcfh'
552s session.config.cache.set(STEPWISE_CACHE_DIR, [])
552s 
552s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
552s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml ---
552s ============================= slowest 30 durations =============================
552s 0.39s call tests/groupby/test_raises.py::test_groupby_raises_category_on_category[-False-last-False-agg]
552s 0.27s call tests/groupby/test_reductions.py::test_ops_general[sem-scipy_sem]
552s 0.21s call tests/groupby/test_groupby.py::test_empty_groupby[False-skew-apply-boolean-keys1-columns1]
552s 0.11s teardown tests/groupby/transform/test_transform.py::test_idxmin_idxmax_transform_args[False-False-idxmin]
552s 0.07s call tests/groupby/test_categorical.py::test_basic
552s 0.06s call tests/groupby/test_counting.py::test_count
552s 0.06s call tests/groupby/test_counting.py::TestCounting::test_ngroup_cumcount_pair
552s 0.05s call tests/groupby/test_timegrouper.py::TestGroupBy::test_timegrouper_with_reg_groups
552s 0.05s call tests/groupby/test_groupby_subclass.py::test_groupby_preserves_subclass[corrwith-obj0]
552s 0.04s call tests/groupby/test_categorical.py::test_datetime
552s 0.04s call tests/groupby/methods/test_describe.py::test_frame_describe_multikey
552s 0.04s call tests/groupby/test_apply.py::test_apply_concat_preserve_names
552s 0.03s call tests/groupby/test_groupby.py::test_groupby_multiindex_not_lexsorted
552s 0.03s call tests/groupby/transform/test_transform.py::test_as_index_no_change[corrwith-keys1]
552s 0.02s call tests/groupby/test_apply.py::test_apply_corner_cases
552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-True-multi]
552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-False-multi]
552s 0.02s call tests/groupby/test_categorical.py::test_observed[False]
552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-True-True-multi]
552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-False-True-multi]
552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-True-True-multi]
552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-True-False-multi]
552s 0.02s call tests/groupby/test_categorical.py::test_describe_categorical_columns
552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-False-False-multi]
552s 0.02s call
tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-True-False-multi] 552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-False-single] 552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-True-single] 552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-True-True-range] 552s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-False-range] 552s 0.02s call tests/groupby/test_grouping.py::TestGrouping::test_groupby_multiindex_partial_indexing_equivalence 552s = 26564 passed, 910 skipped, 1832 deselected, 90 xfailed, 39 xpassed, 3 warnings in 81.96s (0:01:21) = 554s + echo 'rdjoqkol test state = true' 554s + for TEST_SUBSET in $modpath/tests/* 554s rdjoqkol test state = true 554s + echo /usr/lib/python3/dist-packages/pandas/tests/indexes 554s + grep -q -e __pycache__ 554s + PANDAS_CI=1 554s + LC_ALL=C.UTF-8 554s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/indexes 560s ============================= test session starts ============================== 560s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 560s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 560s rootdir: /usr/lib/python3/dist-packages/pandas 560s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 560s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 560s asyncio: mode=Mode.STRICT 560s collected 16998 items / 4 deselected / 16994 selected 560s 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_constructors.py .......s.. 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_formats.py .............. 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_indexing.py ............ 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_pickle.py . 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_reshape.py ..............s.... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_setops.py ............................................................ 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_where.py . 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_append.py ....... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_astype.py ........... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_category.py ......................................... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_constructors.py ..... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_equals.py ......ss 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_fillna.py ... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_formats.py .. 
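The FutureWarning recorded in the groupby warnings summary above comes from numpy's fromnumeric.py forwarding axis=None into DataFrame.sum during tests/groupby/test_categorical.py::test_basic. The warning text itself states the remedy; as a small hedged illustration of the retained behaviour (hypothetical data, not taken from the test):

    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

    # np.sum(df) passes axis=None through to DataFrame.sum; current pandas warns
    # that a future version will reduce over both axes and return one scalar.
    with_warning = np.sum(df)

    # Passing axis=0 (or calling df.sum() with no axis argument) keeps the
    # column-wise reduction and raises no FutureWarning.
    col_totals = df.sum(axis=0)
    print(col_totals)  # a    3
                       # b    7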
560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_indexing.py ................................. 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_map.py ..................... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_reindex.py ....... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_setops.py .. 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_drop_duplicates.py ................................................................................................................ 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_equals.py ..................... 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_indexing.py ................ 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_is_monotonic.py . 560s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_nat.py ...................... 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_sort_values.py ................................................................................... 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_value_counts.py ............................................ 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_asof.py .. 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_astype.py ................................. 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_delete.py ....................... 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_factorize.py .................................................................................... 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_fillna.py .. 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_insert.py ............................................................................................................................................................................................. 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_isocalendar.py .. 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_map.py ..... 561s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_normalize.py ......... 562s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_repeat.py .................................................................................................................................................................................................................................................................................................................................................... 562s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_resolution.py .................................................................................................................................................................................... 
562s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_round.py ...................................................................................................................................................................................................................... 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_shift.py ........................................................................................................................................ 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_snap.py ........................ 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_frame.py .. 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_julian_date.py ..... 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_period.py ........................................... 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_pydatetime.py .. 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_series.py . 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_tz_convert.py .................................... 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_tz_localize.py ............................................................................................................................................................ 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_unique.py ........................ 563s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_arithmetic.py .....................x 564s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_constructors.py ..................................................................................................................................................................................................................x...x.................................... 565s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_date_range.py .................................................................................................................................................................................................................................................................................................................................................. 565s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_datetime.py .................. 565s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_formats.py ........................................ 565s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_freq_attr.py .......................... 566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_indexing.py .......................................................................................................................................................................................................................................................................................................................................................................................... 
566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_iter.py ............ 566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_join.py ...................... 566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_npfuncs.py . 566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_ops.py ................ 566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_partial_slicing.py .................................. 566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_pickle.py ...... 566s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_reindex.py .. 571s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_scalar_compat.py ..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 571s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_setops.py ................................................................................................................................ 571s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_timezones.py ........................................ 572s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_astype.py ....................................x........................................................................................................................... 572s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_constructors.py ............................................................................................................................................................................................................................................................................ssssssss.......................................s.................s.....s.....s.....s....................................ssssssss.......................................s.................s.....s.....s.....s.................................s 572s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_equals.py .... 572s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_formats.py ........... 573s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_indexing.py ............................................................................................................................................................................................................................................................................................ 573s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_interval.py .......x....x....x....x.................................................................................................................................................................................................................................. 
574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_interval_range.py ........................................................................................................................................................ 574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_interval_tree.py .................................................................................................................................................................................................................... 574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_join.py ... 574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_pickle.py ... 574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_setops.py ................................................................................. 574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_analytics.py ...................................... 574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_astype.py ... 574s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_compat.py ...... 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_constructors.py ..................................................................................s................. 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_conversion.py ...... 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_copy.py .......... 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_drop.py ............. 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_duplicates.py .................................................. 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_equivalence.py .............. 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_formats.py .............. 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_get_level_values.py ....... 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_get_set.py ................... 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_indexing.py ........................................................................................................................................... 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_integrity.py ................ 575s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_isin.py .............. 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_join.py ....................................................... 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_lexsort.py .. 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_missing.py ...x.. 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_monotonic.py ........... 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_names.py ............................... 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_partial_indexing.py ..... 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_pickle.py . 576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_reindex.py ............ 
576s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_reshape.py ........... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_setops.py ............................................................................................................................................................................................................sss................................................................... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_sorting.py .......................... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_take.py ... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_astype.py ................... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_indexing.py ..............................................................................................................................................ss.......................................... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_join.py ........... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_numeric.py ................................................................................................ 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_setops.py .................... 577s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/object/test_astype.py .. 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/object/test_indexing.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.ss. 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_asfreq.py ............... 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_astype.py ............. 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_factorize.py .. 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_fillna.py . 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_insert.py ... 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_is_full.py . 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_repeat.py ...... 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_shift.py ...... 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_to_timestamp.py ........ 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_constructors.py .................................................................................................. 
578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_formats.py ................... 578s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_freq_attr.py . 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_indexing.py ......................................................................... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_join.py ........... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_monotonic.py .. 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_partial_slicing.py .............. 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_period.py .................................................................................................................................... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_period_range.py ........................... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_pickle.py .... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_resolution.py ......... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_scalar_compat.py ... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_searchsorted.py ........ 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_setops.py .............. 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_tools.py ............ 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_constructors.py ............................. 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_indexing.py ............... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_join.py .......... 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_range.py ................................................................................. 579s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_setops.py ................................................................... 580s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_any_index.py ......................................................................................................................s......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
582s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_base.py .........................................................................................................................................................................x.............................................................................ssss....ss..........ss......ss.................................................................................................................................ssss...........................................................................................................................................................................................................................................s.......................................................................................................s..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s 583s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_common.py ...........................................................................................................................................................................................................xxxxxxxxxxxxxxxxxxxxxxxxxxxxx.........................................................................................................................sssssssss...s....ss..........................xs.....................sss................................................sss....................................................................................s................s...............................................................................................................................................................................................................................................................................XX........................................... 583s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_datetimelike.py ........................................ 583s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_engines.py ......................................... 583s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_frozen.py .......... 583s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_index_new.py ............................................xxxxssss................................................................................................................ 
584s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_indexing.py ......................................................ss...............................s.................................................................................................................................................................................................................................................................................................s........................ 585s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_numpy_compat.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss..................... 586s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_old_base.py s...s...................sss.............................ssssssssss.s..........ss.................s.............s.....s..............s..sss..........................................................s.......................................................................ssssssss..s..sssssssss..s..sssssssss..s..sssssssss..s..sssssssss..s..s......................s..............................................s................s..............................s........................ssssssss....s.s...s.....s........sssssssss...s....s...sss...................................................................................................................ss......................ssssss.........................................................................................................................................................................s......................................................................................................................................................................................s...s...........s...s...........................................................................................s...s... 
590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_setops.py ...........................................................................................................................................x........................................................................................................................................................................................................................................................................................................................................................................................................................X..................................................................x....................................................................................................X.........X...............................................................................................................X..........................................................................................................................................................................................................................................................................................................................................................s...........................................................................................................................ss..s.s...s...s.....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssss....ss..........ss......ss..................................................................................................................................................................................................................................................................ssss....ss..........ss......ss................................................................................................................................................................................................................................................................s...........................................................................................................................................................................................ss 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_subclass.py . 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_astype.py ............... 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_factorize.py .. 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_fillna.py . 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_insert.py ............... 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_repeat.py . 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_shift.py ...... 
590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_arithmetic.py ... 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_constructors.py ..................... 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_delete.py ... 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_formats.py ..... 590s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_freq_attr.py ........... 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_indexing.py .................................... 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_join.py ....... 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_ops.py .......... 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_pickle.py . 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_scalar_compat.py ........ 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_searchsorted.py ........ 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_setops.py ................................ 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_timedelta.py ... 591s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_timedelta_range.py ............................ 591s 591s =============================== warnings summary =============================== 591s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 591s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-jj6wbms1' 591s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 591s 591s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 591s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-j80judde' 591s session.config.cache.set(STEPWISE_CACHE_DIR, []) 591s 591s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 591s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 591s ============================= slowest 30 durations ============================= 591s 0.25s call tests/indexes/period/test_indexing.py::TestGetItem::test_getitem_seconds 591s 0.24s call tests/indexes/test_numpy_compat.py::test_numpy_ufuncs_out[int8] 591s 0.22s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[ns] 591s 0.13s call tests/indexes/ranges/test_setops.py::test_range_difference 591s 0.10s call tests/indexes/multi/test_indexing.py::test_pyint_engine 591s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[ms] 591s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[us] 591s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[s] 591s 0.09s call 
tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[ns] 591s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[us] 591s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[s] 591s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[ms] 591s 0.09s teardown tests/indexes/timedeltas/test_timedelta_range.py::TestTimedeltas::test_timedelta_range_deprecated_freq[2.5T-5 hours-5 hours 8 minutes-expected_values1-150s] 591s 0.06s call tests/indexes/period/test_partial_slicing.py::TestPeriodIndex::test_range_slice_seconds[period_range] 591s 0.05s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[left-1] 591s 0.04s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[both-1] 591s 0.04s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[right-1] 591s 0.04s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[neither-1] 591s 0.04s call tests/indexes/datetimes/methods/test_tz_localize.py::TestTZLocalize::test_dti_tz_localize_roundtrip[tzlocal()] 591s 0.03s call tests/indexes/multi/test_sorting.py::test_remove_unused_levels_large[datetime64[D]-str] 591s 0.03s call tests/indexes/datetimes/methods/test_tz_localize.py::TestTZLocalize::test_dti_tz_localize_roundtrip[zoneinfo.ZoneInfo(key='US/Pacific')] 591s 0.03s call tests/indexes/datetimes/test_timezones.py::TestDatetimeIndexTimezones::test_with_tz[tz1] 591s 0.03s call tests/indexes/datetimes/test_timezones.py::TestDatetimeIndexTimezones::test_with_tz[tz0] 591s 0.03s call tests/indexes/datetimes/test_constructors.py::TestDatetimeIndex::test_constructor_datetime64_tzformat[W-SUN] 591s 0.03s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonNano::test_date_range_freq_matches_reso 591s 0.03s call tests/indexes/multi/test_integrity.py::test_consistency 591s 0.02s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[left-10] 591s 0.02s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[right-10] 591s 0.02s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[both-10] 591s 0.02s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[neither-10] 591s = 16688 passed, 254 skipped, 4 deselected, 46 xfailed, 6 xpassed, 2 warnings in 35.02s = 592s + echo 'rdjoqkol test state = true' 592s + for TEST_SUBSET in $modpath/tests/* 592s + echo /usr/lib/python3/dist-packages/pandas/tests/indexing 592s rdjoqkol test state = true 592s + grep -q -e __pycache__ 592s + PANDAS_CI=1 592s + LC_ALL=C.UTF-8 592s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/indexing 595s ============================= test session starts ============================== 595s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 595s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 595s rootdir: 
/usr/lib/python3/dist-packages/pandas 595s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 595s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 595s asyncio: mode=Mode.STRICT 595s collected 4389 items 595s 595s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/interval/test_interval.py .............................. 595s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/interval/test_interval_new.py ..................... 595s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_chaining_and_caching.py .. 595s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_datetime.py .. 595s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_getitem.py ............................................................................. 595s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_iloc.py ................ 596s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_indexing_slow.py .......... 596s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_loc.py ................................................................................................................................. 596s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_multiindex.py ................ 596s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_partial.py ............. 597s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_setitem.py ........................... 597s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_slice.py ............................. 597s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_sorted.py ......... 597s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_at.py ......................................... 597s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_categorical.py .................s................................................................................................ 598s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_chaining_and_caching.py .............................. 598s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_check_indexer.py ....................s.... 599s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_coercion.py ..........xxxxxxx...............................................................xx............................................xxxx....x............................................................xxxxx..................xx............................................................................................................................................................................................................x 599s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_datetime.py .........ss 599s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_floats.py ............................................................................................................................................... 599s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_iat.py ..... 
599s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_iloc.py .................................................................................................................................................................................................................. 599s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_indexers.py ...... 601s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_indexing.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 604s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_loc.py .............................................................................................................................x...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s...................................................................................................................................................................................................................................................................................................................................................................s.................................... 604s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_na_indexing.py .............................................................................................................................................................................................................................................................................. 
604s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_partial.py .................................... 605s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_scalar.py ...................................... 605s 605s =============================== warnings summary =============================== 605s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 605s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-d1758tis' 605s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 605s 605s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 605s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-iugnyzwy' 605s session.config.cache.set(STEPWISE_CACHE_DIR, []) 605s 605s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 605s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 605s ============================= slowest 30 durations ============================= 605s 0.30s call tests/indexing/test_loc.py::TestLocBaseIndependent::test_loc_non_unique_memory_error[900000-100000] 605s 0.12s call tests/indexing/test_loc.py::TestLocBaseIndependent::test_loc_getitem_range_from_spmatrix[int64-coo_matrix] 605s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-2] 605s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-3] 605s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-1] 605s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-4] 605s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-0] 605s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-3] 605s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-4] 605s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-2] 605s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-1] 605s 0.08s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-0] 605s 0.06s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_implicit_take2 605s 0.06s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_setting_entire_column 605s 0.05s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_str 605s 0.05s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_implicit_take 605s 0.04s call tests/indexing/multiindex/test_setitem.py::TestMultiIndexSetItem::test_groupby_example 605s 0.03s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_per_axis_per_level_setitem 605s 0.02s call tests/indexing/test_loc.py::TestLocSeries::test_loc_nonunique_masked_index 605s 0.02s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_per_axis_per_level_getitem 605s 0.02s teardown 
tests/indexing/test_scalar.py::TestMultiIndexScalar::test_multiindex_at_get_one_level 605s 0.01s call tests/indexing/multiindex/test_setitem.py::TestMultiIndexSetItem::test_setitem_multiindex3 605s 0.01s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_per_axis_per_level_doc_examples 605s 0.01s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_false_positives 605s 0.01s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_loc_axis_arguments 605s 0.01s call tests/indexing/test_loc.py::TestLocCallable::test_frame_loc_setitem_callable 605s 0.01s call tests/indexing/test_loc.py::TestLocCallable::test_frame_loc_getitem_callable 605s 0.01s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_multiindex_slicers_non_unique 605s 0.01s call tests/indexing/test_indexing.py::TestMisc::test_rhs_alignment 605s 0.01s call tests/indexing/test_categorical.py::TestCategoricalIndex::test_ix_categorical_index_non_unique[False] 605s =========== 4360 passed, 6 skipped, 23 xfailed, 2 warnings in 10.83s =========== 605s + echo 'rdjoqkol test state = true' 605s + for TEST_SUBSET in $modpath/tests/* 605s rdjoqkol test state = true 605s + echo /usr/lib/python3/dist-packages/pandas/tests/interchange 605s + grep -q -e __pycache__ 605s + PANDAS_CI=1 605s + LC_ALL=C.UTF-8 605s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/interchange 607s ============================= test session starts ============================== 607s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 607s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 607s rootdir: /usr/lib/python3/dist-packages/pandas 607s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 607s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 607s asyncio: mode=Mode.STRICT 607s collected 140 items 607s 607s ../../../usr/lib/python3/dist-packages/pandas/tests/interchange/test_impl.py ..sssssssss..............sssss........s..s..ssssssssssssssssssssssssssssssss. 607s ../../../usr/lib/python3/dist-packages/pandas/tests/interchange/test_spec_conformance.py ................ 
607s ../../../usr/lib/python3/dist-packages/pandas/tests/interchange/test_utils.py ................sssssssssssssssssssssssssssssss 607s 607s =============================== warnings summary =============================== 607s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 607s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-hhddnn9z' 607s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 607s 607s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 607s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-8wo626sw' 607s session.config.cache.set(STEPWISE_CACHE_DIR, []) 607s 607s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 607s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 607s ============================= slowest 30 durations ============================= 607s 0.02s call tests/interchange/test_impl.py::test_dataframe[4] 607s 0.01s call tests/interchange/test_impl.py::test_dataframe[3] 607s 0.01s call tests/interchange/test_impl.py::test_dataframe[0] 607s 0.01s call tests/interchange/test_impl.py::test_dataframe[1] 607s 0.01s call tests/interchange/test_impl.py::test_dataframe[2] 607s 607s (25 durations < 0.005s hidden. Use -vv to show these durations.) 607s ================== 61 passed, 79 skipped, 2 warnings in 0.38s ================== 607s + echo 'rdjoqkol test state = true' 607s + for TEST_SUBSET in $modpath/tests/* 607s rdjoqkol test state = true 607s + echo /usr/lib/python3/dist-packages/pandas/tests/internals 607s + grep -q -e __pycache__ 607s + PANDAS_CI=1 607s + LC_ALL=C.UTF-8 607s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/internals 609s ============================= test session starts ============================== 609s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 609s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 609s rootdir: /usr/lib/python3/dist-packages/pandas 609s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 609s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 609s asyncio: mode=Mode.STRICT 609s collected 257 items 609s 609s ../../../usr/lib/python3/dist-packages/pandas/tests/internals/test_api.py ......... 609s ../../../usr/lib/python3/dist-packages/pandas/tests/internals/test_internals.py .................................................................................................................................................................................................................................................... 612s ../../../usr/lib/python3/dist-packages/pandas/tests/internals/test_managers.py .... 
612s 612s =============================== warnings summary =============================== 612s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 612s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-lnlclzb0' 612s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 612s 612s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 612s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-cx02n2l3' 612s session.config.cache.set(STEPWISE_CACHE_DIR, []) 612s 612s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 612s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 612s ============================= slowest 30 durations ============================= 612s 1.22s call tests/internals/test_managers.py::test_array_manager_depr_env_var[block] 612s 1.21s call tests/internals/test_managers.py::test_array_manager_depr_env_var[array] 612s 0.01s call tests/internals/test_internals.py::TestBlockManager::test_equals_block_order_different_dtypes[c:sparse;d:sparse_na;b:f8] 612s 0.01s call tests/internals/test_internals.py::TestCanHoldElement::test_interval_can_hold_element[4-uint64] 612s 0.01s call tests/internals/test_internals.py::TestBlockManager::test_equals_block_order_different_dtypes[a:i8;e:dt;f:td;g:string] 612s 0.01s call tests/internals/test_internals.py::TestCanHoldElement::test_interval_can_hold_element[4-int64] 612s 612s (24 durations < 0.005s hidden. Use -vv to show these durations.) 612s ======================= 257 passed, 2 warnings in 3.19s ======================== 612s + echo 'rdjoqkol test state = true' 612s + for TEST_SUBSET in $modpath/tests/* 612s rdjoqkol test state = true 612s + echo /usr/lib/python3/dist-packages/pandas/tests/io 612s + grep -q -e __pycache__ 612s + PANDAS_CI=1 612s + LC_ALL=C.UTF-8 612s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/io 619s ============================= test session starts ============================== 619s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 619s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 619s rootdir: /usr/lib/python3/dist-packages/pandas 619s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 619s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 619s asyncio: mode=Mode.STRICT 619s collected 15511 items / 201 deselected / 2 skipped / 15310 selected 619s 619s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_odf.py ..... 619s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_odswriter.py .......... 620s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_openpyxl.py ...................................................... 
630s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_readers.py ......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss............ss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..sssssssssss..s..s.sssssssssssssssssss..s..s.sssssssssssssssssssssssssssssss............ss..ssssssssss......s.sssssssssssssssssss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss.xx.xxs.sssss......s.sssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......sssssss......sssssss......s.sssss......s.sssss......s.sssss......s.sssss..................sss...sssssssssssssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..ssssssssss......s.sssss......s.sssss......sssssss...ssss.sssss 632s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_style.py ...................................................................................................................................s 641s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_writers.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 641s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_xlrd.py ....... 641s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_xlsxwriter.py ..... 641s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_bar.py ..........................................................................................................................s 641s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_exceptions.py ... 642s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_format.py ......................................................................................................... 642s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_highlight.py ................................................................................................ 
642s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_html.py ........................................................................................... 642s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_matplotlib.py ........................................................... 642s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_non_unique.py ......... 643s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_style.py ................................................................................................................................................................................................. 643s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_to_latex.py ............................................................................................................................................................ 643s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_to_string.py ..... 643s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_tooltip.py .... 643s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_console.py ........... 643s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_css.py ............................................................................................... 644s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_eng_formatting.py ....... 645s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_format.py .................................................................................................................................................................................. 645s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_ipython_compat.py ss..s 645s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_printing.py ......... 645s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_csv.py ..........s.................................................................................. 645s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_excel.py .......................................................................................................................................................................................................................................................................................................................................................................................................................... 646s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_html.py .....................................................................................................................................................................................................................................................................................................................................................................s................... 646s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_latex.py ............................................................................................ 646s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_markdown.py .......... 
646s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_string.py ...............................................s........................................... 647s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_compression.py ........sssssss.................s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s....... 647s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_deprecated_kwargs.py . 647s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_json_table_schema.py ...........................................................................................x...x...x...x...x...x...x...x...x...x......x...x...x...x...x...x...x...x...x...x........................... 647s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_json_table_schema_ext_dtype.py ..................... 648s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_normalize.py ..................................................... 650s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_pandas.py ...........................................................................................................xxxx................................xxx........................................................................................................................................................................s........xxxxxxxxxxxxxxxxxx................................................................xx.............s.....x........ssssssssssssssssssssssssssssssss...sssss.s. 650s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_readlines.py ..s.s.....ss.s.s.s.s......ssss....s..ss..ss.s.s........... 651s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_ujson.py ................................................................................................................................................................................................................... 
651s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_chunksize.py ......ss.........sss......ss...s...s......ss...s...s...s......ss...s...s...s 652s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_common_basic.py ....s...s...s...ss..s...s......ss.........sss...s...s...s...s...s.........sss............ssss......ss...s...s......ss.........sss...s......ss...s.........sss...s...s......ss...s...s...............sssss...s...s...s......ss...s......ss......ss...s...s...s...s...s...s 652s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_data_list.py ...s...s...s...s 652s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_decimal.py ......ss...s 653s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_file_buffer_url.py ...s...s...s...s...s.................................sssssssssss...s...s...s...s...s...s...s...s............ssss.....................sssssss...s...s...s...s 653s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_float.py ...s.......s.....................sss....x.....x...ss 653s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_index.py ......ss......ss...s............ssss...s...s...s...s...s...s...s 653s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_inf.py ......ss......ss...s 653s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_ints.py ...s............ssss...s......ss...s.........sss.........sss......ss......ss...s 654s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_iterator.py ...s...s...s.........sss...s 654s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_read_errors.py ...s...s...s.........sss...s..................ssssss...s...s...s...s...s..xs...s...s...s...s 654s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_verbose.py ...s...s 654s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/dtypes/test_categorical.py .........sss......ss...s...s...s...s...s...s........................ssssssss...s...s...s...s...s............ssss...s 656s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/dtypes/test_dtypes_basic.py ............ssss...s...s...s...s................................................ssssssssssssssss...s...s.............................x.x.x...................................................................................................................................................................x.x.x.x.x.x.x.x.x.x.x.x.........................................................................................................................................................................................................................................................................................................................x.x.x.x.x.x.x.x.x.x.x.x...................................................................................................................................................................................................................s......ss...s...s........................ssssssss......ss...s...s...s...sssssssss...sssss...sssssssssssssssssssssssssssss...s...s 656s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/dtypes/test_empty.py ...s...s...s...s...s...s...s...s........................ssssssss 657s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_c_parser_only.py 
.......................................................................... 657s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_comment.py ......ss.........sss...s...s...s..................ssssss......ssxx.s 658s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_compression.py .........sss......ss...s...s.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss..................ssssss......................................................ssssssssssssssssss.........sss...s...s...s 658s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_concatenate_chunks.py ss 658s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_converters.py ...s............ssss...s...s...s......ss...s...s 658s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_dialect.py ...s...s...s......................................................ssssssssssssssssss..................ssssss 659s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_encoding.py ...s...s..................ssssss...s...s...............sssss......................................................ssssssssssssssssss.........sss............................................................................................................ssssssssssssssssssssssssssssssssssss...s....................sssss...s...s......ss 659s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_header.py ...s...s......ss......ss...s...s...s.........sss.........sss.........sss.........sss...s...s...s...s......ss......ss......ss......ss...s.........sss........................ssssssss...s...s...s...s...s...s...s...s...ss 660s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_index_col.py ......ss...s...s...s..............................ssssssssss...s...............sssss...s...s...s...s...s...s...s...s......ss...s 660s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_mangle_dupes.py ...s...s...s.........sss.........sss...s...s...s......ss 660s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_na_values.py ...s...s..........................................ssssssssssssss...s......ss...s...s.........sss............ssss...s...s...s......ss......ss...s......ss...s...s.........sss...s......ss...s......ss..................ssssss...s...s...s...s...s 666s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_network.py ................ssss.....ssssssssssssssssss 667s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_parse_dates.py ...s...s...s......ss..............ss...s...s...s...s......ss...s...s...s...s...sxxxxxxss......ss......ss......ss...s.........sss...s...s...s......ss.........sss............ssss......ss...s......ss......ss............ssss...s......ss...s......ss......ss...s...s...s............ssss...s..................ssssss.........sss......ss............................sssssssss...s...s......ss...s...s......s......ss...s...s...s......ss...s...s...s 667s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_python_parser_only.py ..................................................................................... 
668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_quoting.py .........sss......ss...s..................ssssss............ssss...............sssss......ss......ss......ss 668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_read_fwf.py ..........................................................................sss.. 668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_skiprows.py ......ss...s...s.........sss...s.........sss........xsss...s......ss...s...s...s...s...s 668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_textreader.py ..................................... 668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_unsupported.py ..........s...s..xs...s 668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_upcast.py ...........................ssss 668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/usecols/test_parse_dates.py ......ss...s...s...s............ssss 668s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/usecols/test_strings.py ...s...s......ss......ss 669s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/usecols/test_usecols_basic.py ...s......ss...s......ss...s...s...s......ss............ssss...s...s...s...s...s...s......ss...s...s......ss......ss......ss.....................sssssss......ss......ss...s...s...s 670s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py ....x................ 670s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_categorical.py ..... 670s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_compat.py .... 670s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_complex.py ......... 671s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_errors.py ................ 675s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py ...................................................xxxxxxxxx................................................................................................................................... 675s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_keys.py .... 675s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_put.py ...................... 675s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_pytables_missing.py s 675s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_read.py ....................s 675s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_retain_attributes.py ..... 676s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_round_trip.py ..............................s 678s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_select.py ...x.....x............... 680s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py ............................................x..................... 680s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_subclass.py .. 680s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_time_series.py .... 681s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_timezones.py .......................................................................... 682s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_byteswap.py .......... 
682s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_sas.py ... 682s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_sas7bdat.py ....................... 683s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_xport.py ....... 683s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_clipboard.py .......QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-ubuntu' 684s ...............................................................................................................................................................................................................................................................................................................................ssssss.. 684s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_common.py .......................................................s....s........ss.......s.........s.......s......................................... 687s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_compression.py ........................................................................................................................................ 687s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_fsspec.py .........ssssssssss..........s 687s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_gbq.py .. 687s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_gcs.py ssssssssssssssssss. 695s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_html.py .........ssssss...............................................................................................................................s.................... 700s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_http_headers.py ...ss.......ss.......ss....ss 700s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_parquet.py .ssss.ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ssssssss 707s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_pickle.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 
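The QStandardPaths notice emitted during the clipboard tests above is Qt falling back to '/tmp/runtime-ubuntu' because XDG_RUNTIME_DIR is not set inside the xvfb-run environment; it does not affect the results. A minimal sketch of how a harness could pre-set the variable before PyQt5 is imported (the directory name is illustrative; Qt wants a private, user-owned directory, which tempfile.mkdtemp provides):

    import os
    import tempfile

    # Export a private 0700 directory before any Qt-based test loads PyQt5;
    # mkdtemp creates the directory with owner-only permissions by default.
    os.environ.setdefault("XDG_RUNTIME_DIR", tempfile.mkdtemp(prefix="runtime-"))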
707s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_s3.py sss 707s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_spss.py .......s.F 718s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py ssss....ssssss....ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss...ssss....ssssss...ssss...ssss...ssss...xssssssssssssssxxxxssssss...ssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss...xssssss...xssssss...xssssss....ssssss....ssssss....ssssss...xssssss...xssssss...xssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss...xssssss...xssssss...xss.ssss...xssssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss....ssssss....ssssss...ssss...ssss...xss.ss...ssss..sssss..sssss..s.ssssss....ssssss...ssss...ssss...ssssxxxssss...ssssxxxssssssssssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss....ssssss....ssssss....ssssss..xssss..sssss..sssss..sssss...ssss...ssss...ssss...ssss..sssss...xssssss....ssssss...ssss....ss..ssss....ssssss....ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssxxxxssssssxxxxssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ss....s..ss................ 723s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_stata.py ..............................................................................................................................................................................................................................................................................................................................................................................s.................................................................................................................................................................................................................................................... 724s ../../../usr/lib/python3/dist-packages/pandas/tests/io/xml/test_to_xml.py ........................................................................s...............................................................ss 728s ../../../usr/lib/python3/dist-packages/pandas/tests/io/xml/test_xml.py ........................................................................................s............................................................................s......s......s......s......s......s......s......s......s......s......s......s..s.sss.sss. 729s ../../../usr/lib/python3/dist-packages/pandas/tests/io/xml/test_xml_dtypes.py .............................................. 
729s 729s =================================== FAILURES =================================== 729s ______________________________ test_spss_metadata ______________________________ 729s 729s datapath = <function datapath.<locals>.deco at 0x7f7a1f40f060> 729s 729s @pytest.mark.filterwarnings("ignore::pandas.errors.ChainedAssignmentError") 729s @pytest.mark.filterwarnings("ignore:ChainedAssignmentError:FutureWarning") 729s def test_spss_metadata(datapath): 729s # GH 54264 729s fname = datapath("io", "data", "spss", "labelled-num.sav") 729s 729s df = pd.read_spss(fname) 729s metadata = { 729s "column_names": ["VAR00002"], 729s "column_labels": [None], 729s "column_names_to_labels": {"VAR00002": None}, 729s "file_encoding": "UTF-8", 729s "number_columns": 1, 729s "number_rows": 1, 729s "variable_value_labels": {"VAR00002": {1.0: "This is one"}}, 729s "value_labels": {"labels0": {1.0: "This is one"}}, 729s "variable_to_label": {"VAR00002": "labels0"}, 729s "notes": [], 729s "original_variable_types": {"VAR00002": "F8.0"}, 729s "readstat_variable_types": {"VAR00002": "double"}, 729s "table_name": None, 729s "missing_ranges": {}, 729s "missing_user_values": {}, 729s "variable_storage_width": {"VAR00002": 8}, 729s "variable_display_width": {"VAR00002": 8}, 729s "variable_alignment": {"VAR00002": "unknown"}, 729s "variable_measure": {"VAR00002": "unknown"}, 729s "file_label": None, 729s "file_format": "sav/zsav", 729s } 729s if Version(pyreadstat.__version__) >= Version("1.2.4"): 729s metadata.update( 729s { 729s "creation_time": datetime.datetime(2015, 2, 6, 14, 33, 36), 729s "modification_time": datetime.datetime(2015, 2, 6, 14, 33, 36), 729s } 729s ) 729s > assert df.attrs == metadata 729s E AssertionError: assert {'column_labe... 33, 36), ...} == {'column_labe... 33, 36), ...} 729s E 729s E Omitting 23 identical items, use -vv to show 729s E Left contains 1 more item: 729s E {'mr_sets': {}} 729s E Use -v to get more diff 729s 729s /usr/lib/python3/dist-packages/pandas/tests/io/test_spss.py:165: AssertionError 729s =============================== warnings summary =============================== 729s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 729s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-a5q83si5' 729s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 729s 729s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429 729s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/lastfailed: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-a7qjn5eq' 729s config.cache.set("cache/lastfailed", self.lastfailed) 729s 729s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 729s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-o83bzlu8' 729s session.config.cache.set(STEPWISE_CACHE_DIR, []) 729s 729s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 729s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 729s ============================= slowest 30
durations ============================= 729s 1.26s call tests/io/test_compression.py::test_with_missing_lzma_runtime 729s 1.20s call tests/io/test_compression.py::test_with_missing_lzma 729s 1.16s call tests/io/pytables/test_file_handling.py::test_complibs[blosc2-1] 729s 1.02s call tests/io/pytables/test_store.py::test_no_track_times 729s 0.84s call tests/io/test_html.py::TestReadHtml::test_banklist_url[bs4] 729s 0.82s call tests/io/test_pickle.py::test_pickle_big_dataframe_compression[gzip-4] 729s 0.81s call tests/io/test_pickle.py::test_pickle_big_dataframe_compression[gzip-5] 729s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[zstd-c-explicit] 729s 0.50s teardown tests/io/parser/test_network.py::test_url_encoding_csv 729s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[xz-python-infer] 729s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options1-json_responder-read_json] 729s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options1-csv_responder-read_csv] 729s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[xz-c-infer] 729s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-gz_json_responder-read_json] 729s 0.50s teardown tests/io/excel/test_readers.py::TestReaders::test_read_from_http_url[(None, '.xls')] 729s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options1-gz_csv_responder-read_csv] 729s 0.50s teardown tests/io/test_html.py::TestReadHtml::test_multiple_matches[bs4] 729s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-gz_csv_responder-read_csv] 729s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-stata_responder-read_stata] 729s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-json_responder-read_json] 729s 0.50s teardown tests/io/test_html.py::TestReadHtml::test_python_docs_table[bs4] 729s 0.50s teardown tests/io/xml/test_xml.py::test_wrong_url[lxml] 729s 0.50s teardown tests/io/xml/test_xml.py::test_wrong_url[etree] 729s 0.50s teardown tests/io/parser/common/test_file_buffer_url.py::test_url[c_low] 729s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[bz2-c-infer] 729s 0.50s teardown tests/io/xml/test_xml.py::test_url_path_error[lxml] 729s 0.50s teardown tests/io/test_html.py::TestReadHtml::test_bad_url_protocol[bs4] 729s 0.50s teardown tests/io/test_html.py::TestReadHtml::test_bad_url_protocol[lxml] 729s 0.50s teardown tests/io/xml/test_xml.py::test_url_path_error[etree] 729s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[zstd-c-infer] 729s =========================== short test summary info ============================ 729s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_spss.py::test_spss_metadata 729s = 1 failed, 12141 passed, 3015 skipped, 201 deselected, 155 xfailed, 3 warnings in 115.21s (0:01:55) = 730s rdjoqkol test state = false 730s + test 1 == 5 730s + TEST_SUCCESS=false 730s + echo 'rdjoqkol test state = false' 730s + for TEST_SUBSET in $modpath/tests/* 730s + echo /usr/lib/python3/dist-packages/pandas/tests/libs 730s + grep -q -e __pycache__ 730s + PANDAS_CI=1 730s + LC_ALL=C.UTF-8 730s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests 
--rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/libs 732s ============================= test session starts ============================== 732s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 732s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 732s rootdir: /usr/lib/python3/dist-packages/pandas 732s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 732s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 732s asyncio: mode=Mode.STRICT 732s collected 2279 items 732s 734s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_hashtable.py ..............s.....................................................................s.............s.......................................................s.............s.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s................................................................................................................................................................................................. 734s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_join.py ................. 734s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_lib.py .................................................................................. 734s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_libalgos.py ........ 
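The single failure in the io run above, test_spss_metadata, is a metadata mismatch rather than a data error: df.attrs as returned by pd.read_spss() now carries one extra entry, 'mr_sets': {}, which the expected dictionary in the test does not list. A minimal sketch of a version-gated expectation, mirroring the gate the test already applies for creation_time/modification_time; the 1.2.8 cutoff is an assumption taken from the pyreadstat/1.2.8-1 trigger of this run rather than a confirmed upstream change point, and Version comes from packaging here instead of the test module's own import:

    import pyreadstat
    from packaging.version import Version

    metadata = {}  # stands in for the expected-attrs dict built in the test
    if Version(pyreadstat.__version__) >= Version("1.2.8"):  # assumed cutoff
        # Newer pyreadstat also reports an "mr_sets" (multiple response sets)
        # entry, empty for this file, so the expectation has to include it.
        metadata["mr_sets"] = {}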
734s 734s =============================== warnings summary =============================== 734s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 734s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-u41eujk5' 734s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 734s 734s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 734s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-8izy2lks' 734s session.config.cache.set(STEPWISE_CACHE_DIR, []) 734s 734s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 734s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 734s ============================= slowest 30 durations ============================= 734s 0.01s teardown tests/libs/test_libalgos.py::TestInfinity::test_infinity_against_nan 734s 0.01s call tests/libs/test_libalgos.py::test_groupsort_indexer 734s 734s (28 durations < 0.005s hidden. Use -vv to show these durations.) 734s ================= 2273 passed, 6 skipped, 2 warnings in 2.35s ================== 734s rdjoqkol test state = false 734s + echo 'rdjoqkol test state = false' 734s + for TEST_SUBSET in $modpath/tests/* 734s + echo /usr/lib/python3/dist-packages/pandas/tests/plotting 734s + grep -q -e __pycache__ 734s + PANDAS_CI=1 734s + LC_ALL=C.UTF-8 734s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/plotting 737s ============================= test session starts ============================== 737s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 737s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 737s rootdir: /usr/lib/python3/dist-packages/pandas 737s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 737s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 737s asyncio: mode=Mode.STRICT 737s collected 1423 items / 212 deselected / 1211 selected 737s 750s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame.py ..................................................XX.........................s.s.s.s....................................................................................................x...................................................................................... 754s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_color.py ......................................................................................... 754s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_groupby.py ...... 755s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_legend.py x..................... 
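The pair of PytestCacheWarning entries recurs in every subset because the suite runs out of the read-only /usr/lib/python3/dist-packages tree, so pytest cannot create its .pytest_cache directory there; the warnings are cosmetic. A small sketch of two stock pytest options that would avoid them (paths are illustrative):

    import pytest

    # Either disable the cache plugin for the run...
    pytest.main(["-p", "no:cacheprovider",
                 "/usr/lib/python3/dist-packages/pandas/tests/libs"])
    # ...or keep it and point the cache at a writable location:
    # pytest.main(["-o", "cache_dir=/tmp/pandas-pytest-cache", ...])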
760s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_subplots.py .........x....................XX................................................................... 761s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_hist_box_by.py ............................. 761s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_backend.py .....s. 763s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_boxplot_method.py .................................................... 763s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_common.py ... 766s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_converter.py ............................................ 777s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_datetimelike.py ...............................................................................................................x..........................................x.......................x...............x..... 777s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_groupby.py ................. 780s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_hist_method.py ...........................x..x...................................................... 786s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_misc.py s....................sss...................sss...................sss.................................. 789s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_series.py ...............................XXXX.............................x........................................................x......................... 789s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_style.py ...................................... 
789s 789s =============================== warnings summary =============================== 789s tests/plotting/frame/test_frame.py: 11 warnings 789s /usr/lib/python3/dist-packages/matplotlib/transforms.py:2652: RuntimeWarning: divide by zero encountered in scalar divide 789s x_scale = 1.0 / inw 789s 789s tests/plotting/frame/test_frame.py: 11 warnings 789s /usr/lib/python3/dist-packages/matplotlib/transforms.py:2654: RuntimeWarning: invalid value encountered in scalar multiply 789s self._mtx = np.array([[x_scale, 0.0 , (-inl*x_scale)], 789s 789s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 789s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-o98hj4qd' 789s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 789s 789s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 789s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-v3wmthod' 789s session.config.cache.set(STEPWISE_CACHE_DIR, []) 789s 789s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 789s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 789s ============================= slowest 30 durations ============================= 789s 1.46s call tests/plotting/test_converter.py::test_registry_mpl_resets 789s 1.38s call tests/plotting/test_converter.py::TestRegistration::test_dont_register_by_default 789s 0.53s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis_smaller[True] 789s 0.44s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_errorbar_timeseries[bar] 789s 0.43s call tests/plotting/test_misc.py::TestSeriesPlots::test_bootstrap_plot 789s 0.39s call tests/plotting/test_datetimelike.py::TestTSPlot::test_finder_daily 789s 0.36s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis[True] 789s 0.36s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_errorbar_timeseries[barh] 789s 0.33s call tests/plotting/frame/test_frame_subplots.py::TestDataFramePlotsSubplots::test_subplots_ts_share_axes 789s 0.30s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis_smaller[False] 789s 0.29s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis[False] 789s 0.29s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_kde_df 789s 0.29s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_hist_df_coord[data1] 789s 0.28s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_errorbar_timeseries[line] 789s 0.27s call tests/plotting/frame/test_frame_subplots.py::TestDataFramePlotsSubplots::test_subplots_dup_columns_secondary_y 789s 0.27s call tests/plotting/test_boxplot_method.py::TestDataFrameGroupByPlots::test_boxplot_legacy3[False-None-1-layout1] 789s 0.25s call tests/plotting/test_boxplot_method.py::TestDataFrameGroupByPlots::test_boxplot_legacy3[True-UserWarning-3-layout0] 789s 0.25s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_sharey_and_ax 789s 0.23s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_df_gridspec_patterns_vert_horiz 
789s 0.23s call tests/plotting/test_hist_method.py::TestDataFrameGroupByPlots::test_histtype_argument[bar-True] 789s 0.23s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_bar_bottom_left_subplots 789s 0.23s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_hist_df 789s 0.22s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_sharex_and_ax 789s 0.22s call tests/plotting/frame/test_frame_subplots.py::TestDataFramePlotsSubplots::test_subplots_constrained_layout 789s 0.20s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_hist_df_coord[data0] 789s 0.20s call tests/plotting/test_boxplot_method.py::TestDataFramePlots::test_stacked_boxplot_set_axis 789s 0.19s call tests/plotting/test_datetimelike.py::TestTSPlot::test_line_plot_inferred_freq[W] 789s 0.18s call tests/plotting/test_misc.py::test_savefig[index2-data0-hexbin] 789s 0.18s call tests/plotting/test_misc.py::test_savefig[index0-data0-hexbin] 789s 0.18s call tests/plotting/test_misc.py::test_savefig[index1-data0-hexbin] 789s = 1177 passed, 15 skipped, 212 deselected, 11 xfailed, 8 xpassed, 24 warnings in 53.49s = 790s rdjoqkol test state = false 790s + echo 'rdjoqkol test state = false' 790s + for TEST_SUBSET in $modpath/tests/* 790s + echo /usr/lib/python3/dist-packages/pandas/tests/reductions 790s + grep -q -e __pycache__ 790s + PANDAS_CI=1 790s + LC_ALL=C.UTF-8 790s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/reductions 792s ============================= test session starts ============================== 792s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 792s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 792s rootdir: /usr/lib/python3/dist-packages/pandas 792s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 792s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 792s asyncio: mode=Mode.STRICT 792s collected 542 items 792s 792s ../../../usr/lib/python3/dist-packages/pandas/tests/reductions/test_reductions.py .....................................................................................................................................................................................................................................................................................................................................s............................................................................................................................... 793s ../../../usr/lib/python3/dist-packages/pandas/tests/reductions/test_stat_reductions.py ......................................................................................... 
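Each TEST_SUBSET iteration above amounts to the same pytest invocation over one tests/ subdirectory, wrapped in xvfb-run so the plotting suites have a display. A rough local equivalent through pytest's Python entry point, omitting xvfb and the Debian-specific -c/--deb-data-root-dir arguments; the flags, environment values and paths are taken from the log, but this is a sketch, not the packaged test script:

    import os
    import pytest

    os.environ["PANDAS_CI"] = "1"
    os.environ["LC_ALL"] = "C.UTF-8"
    rc = pytest.main([
        "--tb=long", "-s", "-m", "not slow",
        "--rootdir=/usr/lib/python3/dist-packages/pandas",
        "/usr/lib/python3/dist-packages/pandas/tests/reductions",
    ])
    print("exit code:", rc)  # 0: all passed, 1: failures, 5: nothing collected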
793s 793s =============================== warnings summary =============================== 793s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 793s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-uksjf259' 793s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 793s 793s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 793s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-gf2kfp1u' 793s session.config.cache.set(STEPWISE_CACHE_DIR, []) 793s 793s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 793s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 793s ============================= slowest 30 durations ============================= 793s 0.24s call tests/reductions/test_stat_reductions.py::TestSeriesStatReductions::test_skew 793s 0.04s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float64-False] 793s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float64-True] 793s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int64-True] 793s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int64-False] 793s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int32-True] 793s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float32-True] 793s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int32-False] 793s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float32-False] 793s 793s (21 durations < 0.005s hidden. Use -vv to show these durations.) 
793s ================== 541 passed, 1 skipped, 2 warnings in 1.37s ================== 793s + echo 'rdjoqkol test state = false' 793s + for TEST_SUBSET in $modpath/tests/* 793s rdjoqkol test state = false 793s + echo /usr/lib/python3/dist-packages/pandas/tests/resample 793s + grep -q -e __pycache__ 793s + PANDAS_CI=1 793s + LC_ALL=C.UTF-8 793s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/resample 795s ============================= test session starts ============================== 795s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 795s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 795s rootdir: /usr/lib/python3/dist-packages/pandas 795s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 795s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 795s asyncio: mode=Mode.STRICT 795s collected 4179 items 795s 798s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_base.py ..............................................................................................................................................................xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx............................................................................................................................................................................................................................................................................................................................................................................................................................................................. 
810s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_datetime_index.py ................................................................................ssss............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x...x...x...x..........................................................................................................................................................................................................................................................................................................................................................................ss 813s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_period_index.py .................................................................................................................................................................................................................................................................................................................................................................................................................................x.................................................................................. 
814s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_resample_api.py ..................................................................................................................................................................................................... 815s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_resampler_grouper.py s................................................... 815s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_time_grouper.py .........................x...................... 815s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_timedelta.py .......................s 815s 815s =============================== warnings summary =============================== 815s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 815s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-pvfuk1zy' 815s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 815s 815s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 815s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-vk2pvwqb' 815s session.config.cache.set(STEPWISE_CACHE_DIR, []) 815s 815s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 815s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 815s ============================= slowest 30 durations ============================= 815s 0.11s call tests/resample/test_datetime_index.py::test_resample_dtype_coercion[s] 815s 0.09s call tests/resample/test_period_index.py::TestPeriodIndex::test_basic_upsample[D] 815s 0.08s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-43200-s-0.5-D-3] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-21600-s-0.25-D-3] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-43200-s-0.5-D-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-21600-s-0.25-D-2] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-86400-s-1-D-3] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-86400-s-1-D-2] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-21600-s-0.25-D-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-86400-s-1-D-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-43200-s-0.5-D-2] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-30-s-0.5-Min-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-86400-s-1-D-2] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-30-s-0.5-Min-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-30-s-0.5-Min-2] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-30-s-0.5-Min-3] 815s 0.07s call 
tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-30-s-0.5-Min-3] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-21600-s-0.25-D-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-3600-s-1-h-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-60-s-1-Min-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-60-s-1-Min-2] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-3600-s-1-h-3] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-3600-s-1-h-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-3600-s-1-h-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-3600-s-1-h-3] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-3600-s-1-h-2] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-86400-s-1-D-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-21600-s-0.25-D-3] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-21600-s-0.25-D-1] 815s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-30-s-0.5-Min-1] 815s =========== 4117 passed, 8 skipped, 54 xfailed, 2 warnings in 20.41s =========== 815s + echo 'rdjoqkol test state = false' 815s rdjoqkol test state = false 815s + for TEST_SUBSET in $modpath/tests/* 815s + echo /usr/lib/python3/dist-packages/pandas/tests/reshape 815s + grep -q -e __pycache__ 815s + PANDAS_CI=1 815s + LC_ALL=C.UTF-8 815s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/reshape 818s ============================= test session starts ============================== 818s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 818s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 818s rootdir: /usr/lib/python3/dist-packages/pandas 818s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 818s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 818s asyncio: mode=Mode.STRICT 818s collected 2610 items / 1 deselected / 2609 selected 818s 818s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_append.py .................................................................................. 818s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_append_common.py ...........................sssssssss....................................................... 818s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_categorical.py ............. 818s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_concat.py .............................................................................................. 818s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_dataframe.py ..................... 
819s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_datetimes.py ..................................................................................................x......... 819s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_empty.py .....................s.....s.....s.....s.....s.......... 819s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_index.py ............................................................. 819s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_invalid.py ....... 819s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_series.py ............. 819s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_sort.py .......... 819s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_join.py .......s..........................s....................................... 822s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge.py ..............................................................................................................................................................................................................................................................................................................................................................................................................................ssssssss..............................................................................................................................................................................................................................ss..........................................................................................................................................................................................................................................................................ssss........ 823s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_asof.py .s...................................................................................................s.....................s....s.s.s.s.............sss.. 823s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_cross.py ................. 823s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_index_as_string.py ................................................................................ 823s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_ordered.py ..................... 824s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_multi.py .....s.s.................................. 824s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_crosstab.py ..................................... 825s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_cut.py ..................................................................................................................................... 825s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_from_dummies.py ......................................... 
825s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_get_dummies.py ...................................................................................................................................................ss 826s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_melt.py ..........................................................s 827s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_pivot.py ...........................................................................xx................................................................................................. 827s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_pivot_multilevel.py .......... 828s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_qcut.py ................................................................................ 828s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_union_categoricals.py .......................................... 828s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_util.py ................. 828s 828s =============================== warnings summary =============================== 828s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 828s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ghysbmp1' 828s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 828s 828s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 828s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-dpgs74v3' 828s session.config.cache.set(STEPWISE_CACHE_DIR, []) 828s 828s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 828s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 828s ============================= slowest 30 durations ============================= 828s 0.07s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_multi_functions 828s 0.06s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_normalize 828s 0.06s call tests/reshape/merge/test_join.py::TestJoin::test_full_outer_join 828s 0.05s call tests/reshape/merge/test_merge.py::TestMerge::test_merge_non_unique_indexes 828s 0.05s call tests/reshape/test_pivot.py::TestPivotTable::test_margins 828s 0.04s call tests/reshape/concat/test_concat.py::TestConcatenate::test_concat_order 828s 0.04s call tests/reshape/test_crosstab.py::TestCrosstab::test_margin_dropna6 828s 0.04s call tests/reshape/test_crosstab.py::TestCrosstab::test_margin_normalize 828s 0.04s call tests/reshape/merge/test_join.py::TestJoin::test_right_outer_join 828s 0.03s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_timegrouper 828s 0.03s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_normalize_arrays 828s 0.03s call tests/reshape/test_crosstab.py::test_categoricals[category-category] 828s 0.02s call tests/reshape/merge/test_multi.py::TestMergeMulti::test_compress_group_combinations 828s 0.02s call tests/reshape/test_crosstab.py::test_categoricals[category-int64] 828s 0.02s call 
tests/reshape/test_crosstab.py::test_categoricals[int64-category] 828s 0.02s call tests/reshape/test_pivot.py::TestPivotTable::test_daily 828s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_margins_set_margin_name 828s 0.02s call tests/reshape/test_qcut.py::test_qcut_binning_issues 828s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_ndarray[list] 828s 0.02s call tests/reshape/test_crosstab.py::test_categoricals[int64-int64] 828s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_ndarray[tuple] 828s 0.02s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_timegrouper_double 828s 0.02s call tests/reshape/merge/test_join.py::TestJoin::test_join_many_non_unique_index 828s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_ndarray[array] 828s 0.02s call tests/reshape/merge/test_merge.py::TestMerge::test_validation 828s 0.02s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_table_margins_name_with_aggfunc_list 828s 0.02s call tests/reshape/merge/test_join.py::TestJoin::test_left_outer_join 828s 0.02s call tests/reshape/merge/test_merge.py::TestMerge::test_merge_indicator_result_integrity 828s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_duplicate_names 828s 0.02s call tests/reshape/merge/test_multi.py::TestMergeMulti::test_left_join_multi_index[True-False] 828s ==== 2561 passed, 45 skipped, 1 deselected, 3 xfailed, 2 warnings in 11.19s ==== 828s + echo 'rdjoqkol test state = false' 828s + for TEST_SUBSET in $modpath/tests/* 828s rdjoqkol test state = false 828s + echo /usr/lib/python3/dist-packages/pandas/tests/scalar 828s + grep -q -e __pycache__ 828s + PANDAS_CI=1 828s + LC_ALL=C.UTF-8 828s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/scalar 832s ============================= test session starts ============================== 832s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 832s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 832s rootdir: /usr/lib/python3/dist-packages/pandas 832s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 832s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 832s asyncio: mode=Mode.STRICT 832s collected 4329 items 832s 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_arithmetic.py ............................................ 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_constructors.py ......... 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_contains.py ................ 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_formats.py . 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_interval.py ............................................ 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_overlaps.py ................................................................................................................................................................. 
832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/period/test_arithmetic.py .................................................................................... 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/period/test_asfreq.py ....................... 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/period/test_period.py ............................................................................................................................................................................................................................................................................................................... 832s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/test_na_scalar.py .....................................................................................ss.....ss.....ss................................................................................................................................................................................ 833s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/test_nat.py ........................................................................................................................s............s............................................................................................................................................................................................................................ 833s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/methods/test_as_unit.py .... 833s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/methods/test_round.py ................... 833s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_arithmetic.py ................................................................................................................................ 834s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_constructors.py ................................................................................................................................................................................................................................................................................................................... 834s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_formats.py .............. 834s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_timedelta.py .................................................................x............ 834s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_as_unit.py .... 834s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_normalize.py ................................................................................................................................................................. 834s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_replace.py ............................................................................................................................ 835s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_round.py ....................................................................................................................................................................................... 
835s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_timestamp_method.py . 835s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_to_julian_date.py ..... 835s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_to_pydatetime.py ....... 835s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_tz_convert.py ............................................................................... 836s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_tz_localize.py ................................................................................................................................................................................................. 836s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_arithmetic.py ............................................................................................................. 836s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_comparisons.py .............................. 836s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_constructors.py ..................................................................xxx.............................................................. 836s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_formats.py ........................................................................... 839s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_timestamp.py .....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x....................................................................................................................................................................................................................................................................................................................................... 839s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_timezones.py .................... 
839s 839s =============================== warnings summary =============================== 839s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 839s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-z9mea09y' 839s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 839s 839s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 839s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-7zi4dhwu' 839s session.config.cache.set(STEPWISE_CACHE_DIR, []) 839s 839s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 839s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 839s ============================= slowest 30 durations ============================= 839s 0.35s call tests/scalar/timedelta/test_timedelta.py::TestTimedeltas::test_hash_equality_invariance 839s 0.20s call tests/scalar/timestamp/test_timestamp.py::TestTimestampProperties::test_dow_parametric 839s 0.14s call tests/scalar/timestamp/methods/test_round.py::TestTimestampRound::test_round_sanity[round] 839s 0.14s call tests/scalar/timestamp/methods/test_round.py::TestTimestampRound::test_round_sanity[floor] 839s 0.14s call tests/scalar/timestamp/methods/test_round.py::TestTimestampRound::test_round_sanity[ceil] 839s 0.13s call tests/scalar/timedelta/methods/test_round.py::TestTimedeltaRound::test_round_sanity[round] 839s 0.13s call tests/scalar/timedelta/methods/test_round.py::TestTimedeltaRound::test_round_sanity[floor] 839s 0.13s call tests/scalar/timedelta/methods/test_round.py::TestTimedeltaRound::test_round_sanity[ceil] 839s 0.03s teardown tests/scalar/timestamp/test_timezones.py::TestTimestampTZOperations::test_timestamp_timetz_equivalent_with_datetime_tz[zoneinfo.ZoneInfo(key='UTC')] 839s 0.02s call tests/scalar/timestamp/test_timestamp.py::test_negative_dates 839s 0.01s call tests/scalar/test_na_scalar.py::test_arithmetic_ops[mul-1.0] 839s 0.01s call tests/scalar/test_nat.py::test_nat_vector_field_access 839s 0.01s call tests/scalar/period/test_period.py::TestPeriodMethods::test_to_timestamp 839s 0.01s call tests/scalar/timestamp/methods/test_tz_localize.py::TestTimestampTZLocalize::test_tz_localize_ambiguous 839s 0.01s call tests/scalar/period/test_arithmetic.py::TestPeriodArithmetic::test_period_add_offset 839s 0.01s call tests/scalar/timedelta/test_timedelta.py::TestTimedeltas::test_implementation_limits 839s 0.01s call tests/scalar/timestamp/test_constructors.py::TestTimestampConstructorFoldKeyword::test_timestamp_constructor_fold_conflict[1572136200000000000-0] 839s 839s (13 durations < 0.005s hidden. Use -vv to show these durations.) 
839s ============ 4316 passed, 8 skipped, 5 xfailed, 2 warnings in 8.91s ============ 839s rdjoqkol test state = false 839s + echo 'rdjoqkol test state = false' 839s + for TEST_SUBSET in $modpath/tests/* 839s + echo /usr/lib/python3/dist-packages/pandas/tests/series 839s + grep -q -e __pycache__ 839s + PANDAS_CI=1 839s + LC_ALL=C.UTF-8 839s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/series 843s ============================= test session starts ============================== 843s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 843s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 843s rootdir: /usr/lib/python3/dist-packages/pandas 843s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 843s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 843s asyncio: mode=Mode.STRICT 843s collected 13018 items / 2 skipped 843s 843s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_cat_accessor.py ................... 849s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_dt_accessor.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 849s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_sparse_accessor.py . 849s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_str_accessor.py .. 849s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_datetime.py ................. 849s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_delitem.py .... 849s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_get.py ............ 849s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_getitem.py .............................................................................................. 850s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_indexing.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
850s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_mask.py .... 850s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_set_value.py ... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_setitem.py .......................................................................................................................sss....sss....sss....sss....sss....sss....sss....sss....sss....sss....sss....sss...s...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...sss.................................................................................................................................................................................sssssssss..................ssssssssssssssssss............................................................ssssssssssss..........................................................................................ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss...........................................................................................................................sss..................ssssssssssss...........................sss.......sssssssssssssssssssssssssss.................................................................................sssssssss.................................ssssss.....................sssssssss........................ssssss........................ssssssssssss....................................ssssssssssss..........................................ssssssssssssssssss................................................ssssssssssss....................................ssssssssssss.........................x........x........x........x........x........x........x........x........x.......sssssssssssssssssssssssssss.........sssssssss..............................ssssssssssss.................................sssssssss....................................ssssssssssssssssss................................................................................................................................................................................................... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_take.py .... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_where.py ......................................................................................................................................................................................................... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_xs.py ...... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_add_prefix_suffix.py ... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_align.py ............................................................................................................... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_argsort.py ......... 854s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_asof.py ....... 855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_astype.py ......................................................s....s..........................................................................x........sssssssss.................s 855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_autocorr.py . 
855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_between.py ....... 855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_case_when.py ........... 855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_clip.py ....s.....s.....s.....s.....s.....s.....s.....s.....s.....s............ 855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_combine.py . 855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_combine_first.py .............................. 855s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_compare.py ............ 856s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_convert_dtypes.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s......ss 856s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_copy.py .......... 856s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_count.py ... 856s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_cov_corr.py ................ 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_describe.py ...................................................... 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_diff.py ....... 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_drop.py ............................. 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_drop_duplicates.py ..................................................................ssssss.............................................................................................................................................................................................................................................................................................................ss 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_dropna.py ........... 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_dtypes.py . 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_duplicated.py .................. 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_equals.py ..................................................... 857s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_explode.py ...............ssss 858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_fillna.py ....................x.x.x................................................................................................................................. 858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_get_numeric_data.py . 
858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_head_tail.py . 858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_infer_objects.py ....... 858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_info.py ........x..... 858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_interpolate.py x.........................................................................................................................................................................................................x.x............. 858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_is_monotonic.py .. 858s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_is_unique.py ........ 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_isin.py ......................................... 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_isna.py .. 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_item.py . 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_map.py ......ss...................................xxx....................................................... 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_matmul.py . 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_nlargest.py ................................................................. 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_nunique.py .. 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_pct_change.py .............. 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_pop.py . 859s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_quantile.py ........................................ 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_rank.py .......................................................................................................ssssssssssssssssssssssssssssss........................................................ssssssssssssssss.....ss.....ss.....ss.....ss.....ss.....................................................ssssssssssssssssss.............................................ssssssssssssssssss.............................................ssssssssssssssssss.............................................ssssssssssssssssss.............................................ssssssssssssssssss. 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_reindex.py ................................... 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_reindex_like.py .. 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_rename.py ................ 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_rename_axis.py ..... 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_repeat.py ... 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_replace.py ...........................s.......................................................................... 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_reset_index.py ........s...... 
860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_round.py ......................................................................................... 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_searchsorted.py ........ 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_set_name.py .. 860s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_size.py ....... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_sort_index.py .............................................. 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_sort_values.py .............. 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_csv.py ................................... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_dict.py ...... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_frame.py ... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_numpy.py ...s. 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_tolist.py ..........sss 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_truncate.py .... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_tz_localize.py ................................................................ 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_unique.py ....... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_unstack.py ....... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_update.py ....................s..... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_value_counts.py ................... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_values.py ... 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_view.py .................................................. 861s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_api.py ................................s.......................................................................................................................................................... 
867s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_arithmetic.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x...............................x............. 868s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_constructors.py ......................................................................................................................................................................................................................................x.........x............................................................................................s.................................xx.................................................sssssss.s..........ss.......... 868s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_cumulative.py ....................................... 869s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_formats.py .................................................. 869s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_iteration.py ....... 869s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_logical_ops.py ..........................xs 869s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_missing.py ...x.. 869s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_npfuncs.py ....s 869s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_reductions.py ..............s............... 869s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_subclass.py ......... 870s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_ufunc.py .....................................................................xxxx........................................................................................................................................ 870s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_unary.py .......................... 870s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_validate.py ............................ 
870s 870s =============================== warnings summary =============================== 870s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 870s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-gnl17jro' 870s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 870s 870s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 870s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-kv51a320' 870s session.config.cache.set(STEPWISE_CACHE_DIR, []) 870s 870s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 870s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 870s ============================= slowest 30 durations ============================= 870s 0.34s call tests/series/test_formats.py::TestSeriesRepr::test_latex_repr 870s 0.25s call tests/series/methods/test_rank.py::test_pct_max_many_rows 870s 0.24s call tests/series/methods/test_cov_corr.py::TestSeriesCorr::test_corr[float64] 870s 0.18s call tests/series/test_arithmetic.py::TestNamePreservation::test_series_ops_name_retention[numexpr-names2-add-True-tuple] 870s 0.10s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[en_NG.UTF-8] 870s 0.07s call tests/series/test_api.py::TestSeriesMisc::test_inspect_getmembers 870s 0.05s teardown tests/series/test_validate.py::test_validate_bool_args[5.0-drop_duplicates] 870s 0.05s call tests/series/methods/test_isin.py::TestSeriesIsIn::test_isin 870s 0.02s call tests/series/accessors/test_cat_accessor.py::TestCatAccessor::test_dt_accessor_api_for_categorical[idx1] 870s 0.01s call tests/series/accessors/test_cat_accessor.py::TestCatAccessor::test_dt_accessor_api_for_categorical[idx0] 870s 0.01s call tests/series/methods/test_to_csv.py::TestSeriesToCSV::test_from_csv 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[shn_MM.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_ambiguous_freq_conversions 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[el_GR.ISO8859-7] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[byn_ER.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[de_IT.UTF-8] 870s 0.01s call tests/series/methods/test_reset_index.py::TestResetIndex::test_reset_index_level 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[ckb_IQ.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[aa_ET.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[cy_GB.UTF-8] 870s 0.01s call 
tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[hu_HU.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[gez_ET.UTF-8@abegede] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[ar_LB.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[am_ET.UTF-8] 870s 0.01s call tests/series/test_logical_ops.py::TestSeriesLogicalOps::test_logical_ops_label_based 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[el_CY.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[ar_SA.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[el_CY.ISO8859-7] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[aa_ER.UTF-8] 870s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[aa_DJ.UTF-8] 870s ========= 12380 passed, 608 skipped, 32 xfailed, 2 warnings in 29.42s ========== 871s rdjoqkol test state = false 871s + echo 'rdjoqkol test state = false' 871s + for TEST_SUBSET in $modpath/tests/* 871s + echo /usr/lib/python3/dist-packages/pandas/tests/strings 871s + grep -q -e __pycache__ 871s + PANDAS_CI=1 871s + LC_ALL=C.UTF-8 871s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/strings 873s ============================= test session starts ============================== 873s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 873s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 873s rootdir: /usr/lib/python3/dist-packages/pandas 873s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 873s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 873s asyncio: mode=Mode.STRICT 873s collected 3604 items 873s 877s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_api.py 
..ss.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x.........................................................................................................xx............xx......xx............................xx........................xxxx..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss 877s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_case_justify.py ..ss...ss.......ssssss...ss....ss..ss...ss..ss..........ssssssssss..ss...ss..ss..ss..ss..ss 877s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_cat.py .......s.s...s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s...................................... 878s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_extract.py ..ss..ss....ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ssssss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss...ss..ss..ss..ss....ssss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ssssss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss....ssss..ss..ss................ssssssssssssssss..ss..ss...ss..sss 878s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_find_replace.py ..ss..............ssssssssssss..ss..ss...........................sss...........................sss..ss..ss...ss........................ssssssssssssssssssssssss..ss......ssssss..ss..ss...ss..ss..ss....ssss..ss..ss..ss..ss..ss....ssss..ss...ss..ss..ss..ss..ss..ss..ss...ss..ss..ss.s.sssss.s.sssss...ss 878s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_get_dummies.py ..ss...ss. 
878s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_split_partition.py ....ssss....ssss..ss..ss..ss............ssssssss..ss..ss..ss..ss....ssss............ssssssssssss....ssss..ss..ss..ss..ss.....ss..ss..ss.....ss..ss.....ssss....ssss....ssss....ssss....ssss......ssss....ssss..ss..ss......ssss...........ss 879s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_string_array.py ...............................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....ssss....ssss.s 879s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_strings.py ......ss...ss.....ssss..ss................ssssssssssssssss....ssss....ssss..ss...ss.............ssssssssssss............ssssssssssss..ss..ss....ssss....ssss....ssss..ss..........ssssssssss..................ssssssssssssssss......ssssss.........ssssss....ssss....ssss..ss...ss..ss..ss.....ssss..ss....................ss....... 879s 879s =============================== warnings summary =============================== 879s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 879s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-u21grk07' 879s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 879s 879s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 879s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-_e8mhfou' 879s session.config.cache.set(STEPWISE_CACHE_DIR, []) 879s 879s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 879s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 879s ============================= slowest 30 durations ============================= 879s 0.10s call tests/strings/test_strings.py::test_isnumeric_unicode_missing[object-isnumeric-expected0] 879s 0.01s call tests/strings/test_strings.py::test_empty_str_methods[string[python]] 879s 0.01s teardown tests/strings/test_strings.py::test_series_str_decode 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data6-names6] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data7-names7] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data5-names5] 879s 0.01s call tests/strings/test_strings.py::test_empty_str_methods[object] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data3-names3] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data2-names2] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data5-names5] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data6-names6] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data7-names7] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data3-names3] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data2-names2] 879s 0.01s call 
tests/strings/test_extract.py::test_extract_expand_capture_groups[string[python]] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data4-names4] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data0-names0] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data1-names1] 879s 0.01s call tests/strings/test_extract.py::test_extractall_same_as_extract_subject_index[string[python]] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data0-names0] 879s 0.01s call tests/strings/test_extract.py::test_extractall_same_as_extract_subject_index[object] 879s 0.01s call tests/strings/test_extract.py::test_extractall[string[python]] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data4-names4] 879s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data1-names1] 879s 0.01s call tests/strings/test_extract.py::test_extract_expand_capture_groups[object] 879s 0.01s call tests/strings/test_extract.py::test_extractall[object] 879s 0.01s call tests/strings/test_extract.py::test_extractall_same_as_extract[string[python]] 879s 0.01s call tests/strings/test_extract.py::test_extractall_same_as_extract[object] 879s 0.01s call tests/strings/test_cat.py::test_str_cat_mixed_inputs[index] 879s 879s (1 durations < 0.005s hidden. Use -vv to show these durations.) 879s ========== 2704 passed, 887 skipped, 13 xfailed, 2 warnings in 6.10s =========== 879s + echo 'rdjoqkol test state = false' 879s rdjoqkol test state = false 879s + for TEST_SUBSET in $modpath/tests/* 879s + echo /usr/lib/python3/dist-packages/pandas/tests/test_aggregation.py 879s + grep -q -e __pycache__ 879s + PANDAS_CI=1 879s + LC_ALL=C.UTF-8 879s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_aggregation.py 881s ============================= test session starts ============================== 881s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 881s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 881s rootdir: /usr/lib/python3/dist-packages/pandas 881s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 881s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 881s asyncio: mode=Mode.STRICT 881s collected 8 items 881s 881s ../../../usr/lib/python3/dist-packages/pandas/tests/test_aggregation.py ........ 
881s 881s =============================== warnings summary =============================== 881s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 881s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-hb1l1wy6' 881s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 881s 881s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 881s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-k8gqs9gh' 881s session.config.cache.set(STEPWISE_CACHE_DIR, []) 881s 881s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 881s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 881s ============================= slowest 30 durations ============================= 881s 881s (24 durations < 0.005s hidden. Use -vv to show these durations.) 881s ======================== 8 passed, 2 warnings in 0.10s ========================= 881s + echo 'rdjoqkol test state = false' 881s + for TEST_SUBSET in $modpath/tests/* 881s rdjoqkol test state = false 881s + echo /usr/lib/python3/dist-packages/pandas/tests/test_algos.py 881s + grep -q -e __pycache__ 881s + PANDAS_CI=1 881s + LC_ALL=C.UTF-8 881s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_algos.py 883s ============================= test session starts ============================== 883s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 883s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 883s rootdir: /usr/lib/python3/dist-packages/pandas 883s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 883s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 883s asyncio: mode=Mode.STRICT 883s collected 463 items 883s 885s ../../../usr/lib/python3/dist-packages/pandas/tests/test_algos.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
885s 885s =============================== warnings summary =============================== 885s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 885s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ib2x7zba' 885s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 885s 885s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 885s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-t1wbuddw' 885s session.config.cache.set(STEPWISE_CACHE_DIR, []) 885s 885s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 885s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 885s ============================= slowest 30 durations ============================= 885s 1.13s call tests/test_algos.py::TestRank::test_pct_max_many_rows 885s 0.27s call tests/test_algos.py::TestRank::test_scipy_compat[arr0] 885s 0.02s call tests/test_algos.py::TestIsin::test_large 885s 0.01s call tests/test_algos.py::TestDuplicated::test_datetime_likes 885s 0.01s call tests/test_algos.py::TestUnique::test_object_refcount_bug 885s 0.01s call tests/test_algos.py::TestIsin::test_same_nan_is_in_large_series 885s 885s (24 durations < 0.005s hidden. Use -vv to show these durations.) 885s ======================= 463 passed, 2 warnings in 2.30s ======================== 886s rdjoqkol test state = false 886s + echo 'rdjoqkol test state = false' 886s + for TEST_SUBSET in $modpath/tests/* 886s + echo /usr/lib/python3/dist-packages/pandas/tests/test_common.py 886s + grep -q -e __pycache__ 886s + PANDAS_CI=1 886s + LC_ALL=C.UTF-8 886s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_common.py 887s ============================= test session starts ============================== 887s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 887s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 887s rootdir: /usr/lib/python3/dist-packages/pandas 887s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 887s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 887s asyncio: mode=Mode.STRICT 887s collected 128 items 887s 890s ../../../usr/lib/python3/dist-packages/pandas/tests/test_common.py ...............x.x.............................................................................................................. 
890s 890s =============================== warnings summary =============================== 890s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 890s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-p3f6al8u' 890s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 890s 890s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 890s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-2jzryyw7' 890s session.config.cache.set(STEPWISE_CACHE_DIR, []) 890s 890s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 890s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 890s ============================= slowest 30 durations ============================= 890s 1.29s call tests/test_common.py::test_bz2_missing_import 890s 1.27s call tests/test_common.py::test_str_size 890s 890s (28 durations < 0.005s hidden. Use -vv to show these durations.) 890s ================== 126 passed, 2 xfailed, 2 warnings in 2.85s ================== 890s + echo 'rdjoqkol test state = false' 890s rdjoqkol test state = false 890s + for TEST_SUBSET in $modpath/tests/* 890s + echo /usr/lib/python3/dist-packages/pandas/tests/test_downstream.py 890s + grep -q -e __pycache__ 890s + PANDAS_CI=1 890s + LC_ALL=C.UTF-8 890s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_downstream.py 892s ============================= test session starts ============================== 892s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 892s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 892s rootdir: /usr/lib/python3/dist-packages/pandas 892s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 892s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 892s asyncio: mode=Mode.STRICT 892s collected 26 items 892s 895s ../../../usr/lib/python3/dist-packages/pandas/tests/test_downstream.py XXX.s..sssss.s..........s. 896s 896s =============================== warnings summary =============================== 896s tests/test_downstream.py::test_dask 896s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:31: FutureWarning: 896s Dask dataframe query planning is disabled because dask-expr is not installed. 896s 896s You can install it with `pip install dask[dataframe]` or `conda install dask`. 896s This will raise in a future version. 
896s 896s warnings.warn(msg, FutureWarning) 896s 896s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 896s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-njbewvwf' 896s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 896s 896s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 896s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-msl5u671' 896s session.config.cache.set(STEPWISE_CACHE_DIR, []) 896s 896s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 896s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 896s ============================= slowest 30 durations ============================= 896s 1.49s call tests/test_downstream.py::test_oo_optimizable 896s 1.48s call tests/test_downstream.py::test_oo_optimized_datetime_index_unpickle 896s 0.10s call tests/test_downstream.py::test_dask 896s 0.01s call tests/test_downstream.py::test_construct_dask_float_array_int_dtype_match_ndarray 896s 0.01s call tests/test_downstream.py::test_yaml_dump 896s 896s (25 durations < 0.005s hidden. Use -vv to show these durations.) 896s ============= 15 passed, 8 skipped, 3 xpassed, 3 warnings in 3.59s ============= 896s rdjoqkol test state = false 896s + echo 'rdjoqkol test state = false' 896s + for TEST_SUBSET in $modpath/tests/* 896s + echo /usr/lib/python3/dist-packages/pandas/tests/test_errors.py 896s + grep -q -e __pycache__ 896s + PANDAS_CI=1 896s + LC_ALL=C.UTF-8 896s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_errors.py 897s ============================= test session starts ============================== 897s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 897s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 897s rootdir: /usr/lib/python3/dist-packages/pandas 897s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 897s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 897s asyncio: mode=Mode.STRICT 897s collected 36 items 897s 897s ../../../usr/lib/python3/dist-packages/pandas/tests/test_errors.py .................................... 
897s 897s =============================== warnings summary =============================== 897s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 897s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-22vhnq2c' 897s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 897s 897s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 897s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-egvcccvk' 897s session.config.cache.set(STEPWISE_CACHE_DIR, []) 897s 897s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 897s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 897s ============================= slowest 30 durations ============================= 897s 897s (30 durations < 0.005s hidden. Use -vv to show these durations.) 897s ======================== 36 passed, 2 warnings in 0.12s ======================== 897s rdjoqkol test state = false 897s + echo 'rdjoqkol test state = false' 897s + for TEST_SUBSET in $modpath/tests/* 897s + echo /usr/lib/python3/dist-packages/pandas/tests/test_expressions.py 897s + grep -q -e __pycache__ 897s + PANDAS_CI=1 897s + LC_ALL=C.UTF-8 897s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_expressions.py 899s ============================= test session starts ============================== 899s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 899s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 899s rootdir: /usr/lib/python3/dist-packages/pandas 899s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 899s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 899s asyncio: mode=Mode.STRICT 899s collected 243 items 899s 900s ../../../usr/lib/python3/dist-packages/pandas/tests/test_expressions.py ................................................................................................................................................................................................................................................... 
900s 900s =============================== warnings summary =============================== 900s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 900s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-hnrwedf6' 900s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 900s 900s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 900s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-66oyf3zn' 900s session.config.cache.set(STEPWISE_CACHE_DIR, []) 900s 900s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 900s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 900s ============================= slowest 30 durations ============================= 900s 0.01s call tests/test_expressions.py::TestExpressions::test_invalid 900s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-False-_mixed] 900s 0.01s call tests/test_expressions.py::TestExpressions::test_bool_ops_warn_on_arithmetic[*-mul] 900s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-False-_integer_integers] 900s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mod-True-_mixed] 900s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-True-_mixed] 900s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mod-False-_mixed] 900s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[le-False-_mixed] 900s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[eq-True-_mixed] 900s 900s (21 durations < 0.005s hidden. Use -vv to show these durations.) 900s ======================= 243 passed, 2 warnings in 0.98s ======================== 900s + echo 'rdjoqkol test state = false' 900s rdjoqkol test state = false 900s + for TEST_SUBSET in $modpath/tests/* 900s + echo /usr/lib/python3/dist-packages/pandas/tests/test_flags.py 900s + grep -q -e __pycache__ 900s + PANDAS_CI=1 900s + LC_ALL=C.UTF-8 900s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_flags.py 902s ============================= test session starts ============================== 902s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 902s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 902s rootdir: /usr/lib/python3/dist-packages/pandas 902s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 902s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 902s asyncio: mode=Mode.STRICT 902s collected 5 items 902s 902s ../../../usr/lib/python3/dist-packages/pandas/tests/test_flags.py ..... 
902s 902s =============================== warnings summary =============================== 902s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 902s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-l64a_wiz' 902s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 902s 902s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 902s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-x17e3uz6' 902s session.config.cache.set(STEPWISE_CACHE_DIR, []) 902s 902s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 902s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 902s ============================= slowest 30 durations ============================= 902s 902s (15 durations < 0.005s hidden. Use -vv to show these durations.) 902s ======================== 5 passed, 2 warnings in 0.09s ========================= 902s rdjoqkol test state = false 902s + echo 'rdjoqkol test state = false' 902s + for TEST_SUBSET in $modpath/tests/* 902s + echo /usr/lib/python3/dist-packages/pandas/tests/test_multilevel.py 902s + grep -q -e __pycache__ 902s + PANDAS_CI=1 902s + LC_ALL=C.UTF-8 902s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_multilevel.py 903s ============================= test session starts ============================== 903s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 903s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 903s rootdir: /usr/lib/python3/dist-packages/pandas 903s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 903s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 903s asyncio: mode=Mode.STRICT 903s collected 19 items 903s 904s ../../../usr/lib/python3/dist-packages/pandas/tests/test_multilevel.py ................... 
904s 904s =============================== warnings summary =============================== 904s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 904s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-yusyx6wx' 904s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 904s 904s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 904s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-eh9w8dsj' 904s session.config.cache.set(STEPWISE_CACHE_DIR, []) 904s 904s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 904s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 904s ============================= slowest 30 durations ============================= 904s 0.01s call tests/test_multilevel.py::TestMultiLevel::test_reindex_level 904s 0.01s call tests/test_multilevel.py::TestMultiLevel::test_alignment 904s 904s (28 durations < 0.005s hidden. Use -vv to show these durations.) 904s ======================== 19 passed, 2 warnings in 0.19s ======================== 904s rdjoqkol test state = false 904s + echo 'rdjoqkol test state = false' 904s + for TEST_SUBSET in $modpath/tests/* 904s + echo /usr/lib/python3/dist-packages/pandas/tests/test_nanops.py 904s + grep -q -e __pycache__ 904s + PANDAS_CI=1 904s + LC_ALL=C.UTF-8 904s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_nanops.py 905s ============================= test session starts ============================== 905s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 905s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 905s rootdir: /usr/lib/python3/dist-packages/pandas 905s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 905s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 905s asyncio: mode=Mode.STRICT 905s collected 245 items 905s 906s ../../../usr/lib/python3/dist-packages/pandas/tests/test_nanops.py ..................................................................................................................................................................................................................................................... 
906s 906s =============================== warnings summary =============================== 906s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 906s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-5fw4rhtf' 906s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 906s 906s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 906s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-jt0ur2n_' 906s session.config.cache.set(STEPWISE_CACHE_DIR, []) 906s 906s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 906s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 906s ============================= slowest 30 durations ============================= 906s 0.27s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[True-0] 906s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nankurt[True] 906s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nanskew[True] 906s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[True-1] 906s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[True-2] 906s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nankurt[False] 906s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nanskew[False] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanmedian[True] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nancorr_spearman 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[False-1] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[False-2] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[False-0] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanmedian[False] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanstd[True-0] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanstd[True-1] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanstd[True-2] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanvar[True-0] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanvar[True-2] 906s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanvar[True-1] 906s 906s (11 durations < 0.005s hidden. Use -vv to show these durations.) 
906s ======================= 245 passed, 2 warnings in 0.98s ======================== 906s + echo 'rdjoqkol test state = false' 906s rdjoqkol test state = false 906s + for TEST_SUBSET in $modpath/tests/* 906s + echo /usr/lib/python3/dist-packages/pandas/tests/test_optional_dependency.py 906s + grep -q -e __pycache__ 906s + PANDAS_CI=1 906s + LC_ALL=C.UTF-8 906s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_optional_dependency.py 908s ============================= test session starts ============================== 908s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 908s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 908s rootdir: /usr/lib/python3/dist-packages/pandas 908s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 908s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 908s asyncio: mode=Mode.STRICT 908s collected 5 items 908s 908s ../../../usr/lib/python3/dist-packages/pandas/tests/test_optional_dependency.py ..... 908s 908s =============================== warnings summary =============================== 908s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 908s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-6417hnxh' 908s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 908s 908s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 908s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-m46pec37' 908s session.config.cache.set(STEPWISE_CACHE_DIR, []) 908s 908s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 908s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 908s ============================= slowest 30 durations ============================= 908s 908s (15 durations < 0.005s hidden. Use -vv to show these durations.) 
908s ======================== 5 passed, 2 warnings in 0.09s ========================= 908s + echo 'rdjoqkol test state = false' 908s rdjoqkol test state = false 908s + for TEST_SUBSET in $modpath/tests/* 908s + echo /usr/lib/python3/dist-packages/pandas/tests/test_register_accessor.py 908s + grep -q -e __pycache__ 908s + PANDAS_CI=1 908s + LC_ALL=C.UTF-8 908s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_register_accessor.py 910s ============================= test session starts ============================== 910s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 910s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 910s rootdir: /usr/lib/python3/dist-packages/pandas 910s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 910s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 910s asyncio: mode=Mode.STRICT 910s collected 7 items 910s 910s ../../../usr/lib/python3/dist-packages/pandas/tests/test_register_accessor.py ....... 910s 910s =============================== warnings summary =============================== 910s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 910s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xhr11rjr' 910s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 910s 910s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 910s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-pfieuwev' 910s session.config.cache.set(STEPWISE_CACHE_DIR, []) 910s 910s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 910s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 910s ============================= slowest 30 durations ============================= 910s 910s (21 durations < 0.005s hidden. Use -vv to show these durations.) 
910s ======================== 7 passed, 2 warnings in 0.10s ========================= 910s rdjoqkol test state = false 910s + echo 'rdjoqkol test state = false' 910s + for TEST_SUBSET in $modpath/tests/* 910s + echo /usr/lib/python3/dist-packages/pandas/tests/test_sorting.py 910s + grep -q -e __pycache__ 910s + PANDAS_CI=1 910s + LC_ALL=C.UTF-8 910s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_sorting.py 912s ============================= test session starts ============================== 912s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 912s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 912s rootdir: /usr/lib/python3/dist-packages/pandas 912s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 912s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 912s asyncio: mode=Mode.STRICT 912s collected 54 items / 15 deselected / 39 selected 912s 912s ../../../usr/lib/python3/dist-packages/pandas/tests/test_sorting.py ....................................... 912s 912s =============================== warnings summary =============================== 912s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 912s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-70__u2du' 912s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 912s 912s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 912s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-rbb5bonv' 912s session.config.cache.set(STEPWISE_CACHE_DIR, []) 912s 912s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 912s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 912s ============================= slowest 30 durations ============================= 912s 0.46s call tests/test_sorting.py::TestSorting::test_int64_overflow_groupby_large_range 912s 0.19s call tests/test_sorting.py::TestSorting::test_int64_overflow_groupby_large_df_shuffled[mean] 912s 0.18s call tests/test_sorting.py::TestSorting::test_int64_overflow_groupby_large_df_shuffled[median] 912s 0.01s call tests/test_sorting.py::TestMerge::test_int64_overflow_outer_merge 912s 912s (26 durations < 0.005s hidden. Use -vv to show these durations.) 
912s ================ 39 passed, 15 deselected, 2 warnings in 0.98s ================= 913s rdjoqkol test state = false 913s + echo 'rdjoqkol test state = false' 913s + for TEST_SUBSET in $modpath/tests/* 913s + echo /usr/lib/python3/dist-packages/pandas/tests/test_take.py 913s + grep -q -e __pycache__ 913s + PANDAS_CI=1 913s + LC_ALL=C.UTF-8 913s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_take.py 914s ============================= test session starts ============================== 914s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 914s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 914s rootdir: /usr/lib/python3/dist-packages/pandas 914s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 914s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 914s asyncio: mode=Mode.STRICT 914s collected 81 items 914s 914s ../../../usr/lib/python3/dist-packages/pandas/tests/test_take.py ................................................................................. 914s 914s =============================== warnings summary =============================== 914s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 914s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-j8xollj2' 914s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 914s 914s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 914s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-j5bj65ae' 914s session.config.cache.set(STEPWISE_CACHE_DIR, []) 914s 914s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 914s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 914s ============================= slowest 30 durations ============================= 914s 914s (30 durations < 0.005s hidden. Use -vv to show these durations.) 
914s ======================== 81 passed, 2 warnings in 0.19s ======================== 914s rdjoqkol test state = false 914s + echo 'rdjoqkol test state = false' 914s + for TEST_SUBSET in $modpath/tests/* 914s + echo /usr/lib/python3/dist-packages/pandas/tests/tools 914s + grep -q -e __pycache__ 914s + PANDAS_CI=1 914s + LC_ALL=C.UTF-8 914s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/tools 916s ============================= test session starts ============================== 916s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 916s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 916s rootdir: /usr/lib/python3/dist-packages/pandas 916s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 916s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 916s asyncio: mode=Mode.STRICT 916s collected 1510 items 916s 919s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_datetime.py ............................................................................ss....................................................................................................................................................ssssssss....................................................................ss................................................................................................................................................................................................................................................................................xx....ss.ssssss....................................................s........................................................................................................................................................................................................................................................................................................ssssssssss............................ 919s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_numeric.py ...s.s....................................................................................................................................................................................................................................................xx.......................................................................................................................ssssss.s.s.................................sss...sss.s....ssss.s.s 919s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_time.py ........... 
919s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_timedelta.py ........................................................................ssssssssssss 919s 919s =============================== warnings summary =============================== 919s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 919s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-vxq9g8is' 919s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 919s 919s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 919s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-288a11fz' 919s session.config.cache.set(STEPWISE_CACHE_DIR, []) 919s 919s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 919s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 919s ============================= slowest 30 durations ============================= 919s 0.05s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-%Y%m%d %H:%M:%S-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-%Y%m%d %H:%M:%S-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-None-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-None-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-%Y%m%d %H:%M:%S-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-%Y%m%d %H:%M:%S-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-None-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-None-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-%Y%m%d %H:%M:%S-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-%Y%m%d %H:%M:%S-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-None-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-None-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-None-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-None-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-%Y%m%d %H:%M:%S-None] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-%Y%m%d %H:%M:%S-True] 919s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_error_iso_week_year[raise-Day of the year directive '%j' is not compatible with ISO year directive '%G'. 
Use '%Y' instead.-1999 50-%G %j] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[None-None] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_fixed_offset 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-None-True] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-None-None] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-%Y%m%d %H:%M:%S-True] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-%Y%m%d %H:%M:%S-None] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[None-True] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[%Y%m%d %H:%M:%S-None] 919s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[%Y%m%d %H:%M:%S-True] 919s 0.01s call tests/tools/test_to_datetime.py::TestTimeConversionFormats::test_to_datetime_parse_tzname_or_tzoffset[%Y-%m-%d %H:%M:%S %Z-dates0-expected_dates0] 919s 0.01s call tests/tools/test_to_datetime.py::TestToDatetimeMisc::test_to_datetime_timezone_name 919s 0.01s call tests/tools/test_to_datetime.py::TestToDatetimeDataFrame::test_dataframe[True] 919s 0.01s teardown tests/tools/test_to_timedelta.py::test_from_timedelta_arrow_dtype[ms] 919s =========== 1440 passed, 66 skipped, 4 xfailed, 2 warnings in 3.47s ============ 920s rdjoqkol test state = false 920s + echo 'rdjoqkol test state = false' 920s + for TEST_SUBSET in $modpath/tests/* 920s + echo /usr/lib/python3/dist-packages/pandas/tests/tseries 920s + grep -q -e __pycache__ 920s + PANDAS_CI=1 920s + LC_ALL=C.UTF-8 920s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/tseries 922s ============================= test session starts ============================== 922s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 922s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 922s rootdir: /usr/lib/python3/dist-packages/pandas 922s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 922s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 922s asyncio: mode=Mode.STRICT 922s collected 5480 items 922s 922s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/frequencies/test_freq_code.py ................... 922s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/frequencies/test_frequencies.py .......... 
924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/frequencies/test_inference.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_calendar.py ........ 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_federal.py ... 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_holiday.py ................................................. 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_observance.py ................................. 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_day.py ....................... 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_hour.py .............................................................................................. 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_month.py ..................... 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_quarter.py .............................................. 924s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_year.py ................... 925s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_common.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 925s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_custom_business_day.py ....... 925s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_custom_business_hour.py ............................ 
925s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_custom_business_month.py .................................................... 925s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_dst.py .......................... 925s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_easter.py .......... 925s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_fiscal.py ............................................................................................................................................. 926s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_index.py ........................ 926s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_month.py ............................................................ 930s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_offsets.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x..................................................................................................................................................................................................................................................x............................................................................................................................................................
.................................................................................... 931s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_offsets_properties.py .. 931s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_quarter.py ........................................................................................ 933s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_ticks.py ............................................................................................................ 933s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_week.py .............................................. 933s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_year.py ................................. 933s 933s =============================== warnings summary =============================== 933s tests/tseries/offsets/test_offsets_properties.py::test_on_offset_implementations 933s /usr/lib/python3/dist-packages/dateutil/zoneinfo/__init__.py:26: UserWarning: I/O error(2): No such file or directory 933s warnings.warn("I/O error({0}): {1}".format(e.errno, e.strerror)) 933s 933s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 933s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-fi_aprer' 933s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 933s 933s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 933s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-u20vn6x3' 933s session.config.cache.set(STEPWISE_CACHE_DIR, []) 933s 933s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 933s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 933s ============================= slowest 30 durations ============================= 933s 0.46s call tests/tseries/offsets/test_offsets_properties.py::test_on_offset_implementations 933s 0.38s call tests/tseries/offsets/test_offsets_properties.py::test_shift_across_dst 933s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Second] 933s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Nano] 933s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Hour] 933s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Minute] 933s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Milli] 933s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Micro] 933s 0.09s call tests/tseries/offsets/test_custom_business_day.py::TestCustomBusinessDay::test_calendar 933s 0.09s call tests/tseries/offsets/test_custom_business_hour.py::TestCustomBusinessHour::test_us_federal_holiday_with_datetime 933s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Hour] 933s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Nano] 933s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Milli] 933s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Minute] 933s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Micro] 933s 
0.08s call tests/tseries/offsets/test_common.py::test_apply_out_of_range[datetime.timezone(datetime.timedelta(days=-1, seconds=82800), 'foo')-LastWeekOfMonth] 933s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Second] 933s 0.03s teardown tests/tseries/offsets/test_year.py::test_add_out_of_pydatetime_range 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BYearEnd--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BQuarterBegin--2] 933s 0.02s call tests/tseries/offsets/test_fiscal.py::TestFY5253LastOfMonthQuarter::test_offset 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BYearBegin--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BusinessMonthBegin--2] 933s 0.02s call tests/tseries/offsets/test_business_month.py::test_apply_index[BusinessMonthBegin--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BQuarterEnd--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[YearEnd--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[MonthBegin--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[QuarterBegin--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[YearBegin--2] 933s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[QuarterEnd--2] 933s ================= 5478 passed, 2 xfailed, 3 warnings in 11.40s ================= 933s rdjoqkol test state = false 933s + echo 'rdjoqkol test state = false' 933s + for TEST_SUBSET in $modpath/tests/* 933s + grep -q -e __pycache__ 933s + echo /usr/lib/python3/dist-packages/pandas/tests/tslibs 933s + PANDAS_CI=1 933s + LC_ALL=C.UTF-8 933s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/tslibs 935s ============================= test session starts ============================== 935s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 935s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 935s rootdir: /usr/lib/python3/dist-packages/pandas 935s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 935s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 935s asyncio: mode=Mode.STRICT 935s collected 1139 items 935s 935s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_api.py . 935s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_array_to_datetime.py ............................................ 935s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_ccalendar.py ................. 937s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_conversion.py ...................................................................... 937s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_fields.py .... 937s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_libfrequencies.py ............ 937s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_liboffsets.py .......................................................................... 937s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_np_datetime.py ........ 
937s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_npy_units.py .. 937s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_parse_iso8601.py ................................................... 944s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_parsing.py .................................ssssssssssssssssssssssssssssssssssssssss..sss...............................................................x...x................................. 944s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_period.py ....................................... 944s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_resolution.py ................... 944s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_strptime.py ....... 944s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_timedeltas.py ......................... 945s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_timezones.py ....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 945s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_to_offset.py ................................................................................................... 945s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_tzconversion.py . 945s 945s =============================== warnings summary =============================== 945s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 945s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-w6igsij0' 945s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 945s 945s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 945s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-e8awo4ta' 945s session.config.cache.set(STEPWISE_CACHE_DIR, []) 945s 945s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 945s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 945s ============================= slowest 30 durations ============================= 945s 0.22s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['dateutil/Asia/Singapore'] 945s 0.22s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%m %Y-False-.] 945s 0.21s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%m %Y-True-.] 945s 0.17s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[tzlocal()] 945s 0.17s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-False-.] 
945s 0.16s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y%m%d-True- ] 945s 0.16s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-False-/] 945s 0.16s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%m %Y-False--] 945s 0.15s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%m %d %Y-False- ] 945s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['+01:15'] 945s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['-02:15'] 945s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['UTC-02:15'] 945s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[datetime.timezone(datetime.timedelta(days=-1, seconds=82800), 'foo')] 945s 0.14s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['UTC+01:15'] 945s 0.14s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[datetime.timezone(datetime.timedelta(seconds=3600))] 945s 0.13s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[pytz.FixedOffset(-300)] 945s 0.13s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[pytz.FixedOffset(300)] 945s 0.12s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%d %m %Y-True-/] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y%m%d-False-.] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-True-.] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-True-.] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y%m%d-True- ] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-False- ] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-True--] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-True-/] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-False- ] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y%m%d-False-/] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-False-.] 
945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-True- ] 945s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-False-/] 945s =========== 1094 passed, 43 skipped, 2 xfailed, 2 warnings in 10.49s =========== 945s rdjoqkol test state = false 945s + echo 'rdjoqkol test state = false' 945s + for TEST_SUBSET in $modpath/tests/* 945s + echo /usr/lib/python3/dist-packages/pandas/tests/util 945s + grep -q -e __pycache__ 945s + PANDAS_CI=1 945s + LC_ALL=C.UTF-8 945s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/util 947s ============================= test session starts ============================== 947s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 947s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 947s rootdir: /usr/lib/python3/dist-packages/pandas 947s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 947s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 947s asyncio: mode=Mode.STRICT 947s collected 916 items 947s 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_almost_equal.py .................................................................................................................................................................... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_attr_equal.py .......................................... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_categorical_equal.py .......... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_extension_array_equal.py ..................... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_frame_equal.py ............................................................................................................... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_index_equal.py ................................................................ 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_interval_array_equal.py ....... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_numpy_array_equal.py ......................... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_produces_warning.py ............................................................................................................................ 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_series_equal.py .............................................................................................. 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_deprecate.py ... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_deprecate_kwarg.py .............. 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_deprecate_nonkeyword_arguments.py ................... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_doc.py .... 
948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_hashing.py ..................................................................................................................................................... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_numba.py . 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_rewrite_warning.py .......... 948s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_shares_memory.py .s 949s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_show_versions.py .... 949s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_util.py ...sx.. 949s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_args.py ...... 949s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_args_and_kwargs.py ...... 949s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_inclusive.py ........... 949s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_kwargs.py .................. 949s 949s =============================== warnings summary =============================== 949s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 949s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-avi0ge61' 949s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 949s 949s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 949s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-8pith6oa' 949s session.config.cache.set(STEPWISE_CACHE_DIR, []) 949s 949s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 949s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 949s ============================= slowest 30 durations ============================= 949s 0.73s call tests/util/test_show_versions.py::test_show_versions 949s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[0-7] 949s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[1-7] 949s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[0-6] 949s 0.01s call tests/util/test_show_versions.py::test_json_output_match 949s 0.01s call tests/util/test_show_versions.py::test_show_versions_console_json 949s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[1-6] 949s 949s (23 durations < 0.005s hidden. Use -vv to show these durations.) 
949s ============ 913 passed, 2 skipped, 1 xfailed, 2 warnings in 2.17s ============= 949s + echo 'rdjoqkol test state = false' 949s + for TEST_SUBSET in $modpath/tests/* 949s + echo /usr/lib/python3/dist-packages/pandas/tests/window 949s + grep -q -e __pycache__ 949s rdjoqkol test state = false 949s + PANDAS_CI=1 949s + LC_ALL=C.UTF-8 949s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.13 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/window 952s ============================= test session starts ============================== 952s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 952s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 952s rootdir: /usr/lib/python3/dist-packages/pandas 952s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 952s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 952s asyncio: mode=Mode.STRICT 952s collected 10242 items / 536 deselected / 1 skipped / 9706 selected 952s 954s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_ewm.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 955s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_expanding.py ........x.......................x..x..x..x..x..x....................x.......................x..x..x..x..x..x................................................................................................................................................................................................................................................................................ 
957s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_rolling.py ..............x..x............................................x..x..x..x..x..x..x..x..x..x..x..x......................................x..x............................................x..x..x..x..x..x..x..x..x..x..x..x................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 958s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_api.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 958s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_apply.py ...s....sssss..........s..s....................................................... 959s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_base_indexer.py .................................................................................................................................................................................................................................... 959s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_cython_aggregations.py ........................................................................ 
963s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_dtypes.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 963s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_ewm.py .......................................................................................................................................................................................................................................ssssssssssss........ssssssssssssssss................ 964s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_expanding.py ..........x................................................................................................................................................................................................ss....s...................s..s......s............................................................................................. 
965s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_groupby.py ................................................................................................................... 965s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 967s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_pairwise.py ........................................................................................................................................................................................................................................................................................................................ 969s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 970s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_functions.py .................................................................................................................................................................................................................................................................................................................................................................................................................................. 971s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_quantile.py .......................................................................................................................................................................................... 971s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py .................................................................... 
971s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_timeseries_window.py ..................................................................................s 972s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_win_type.py ............................................................................................................................................................................................................................................................................................... 972s 972s =============================== warnings summary =============================== 972s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 972s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-io6ux965' 972s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 972s 972s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 972s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ebnzv0_2' 972s session.config.cache.set(STEPWISE_CACHE_DIR, []) 972s 972s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 972s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 972s ============================= slowest 30 durations ============================= 972s 0.27s call tests/window/test_rolling_functions.py::test_rolling_functions_window_non_shrinkage[14] 972s 0.12s call tests/window/test_apply.py::test_time_rule_frame[False] 972s 0.08s call tests/window/test_apply.py::test_frame[False] 972s 0.05s call tests/window/test_expanding.py::test_expanding_corr_pairwise 972s 0.04s teardown tests/window/test_win_type.py::test_rolling_center_axis_1 972s 0.04s call tests/window/test_expanding.py::test_expanding_cov_pairwise 972s 0.03s call tests/window/test_apply.py::test_center_reindex_frame[False] 972s 0.03s call tests/window/test_apply.py::test_min_periods[False-None-0] 972s 0.03s call tests/window/test_apply.py::test_min_periods[False-1-0] 972s 0.03s call tests/window/test_apply.py::test_center_reindex_series[False] 972s 0.02s call tests/window/test_pairwise.py::test_rolling_pairwise_cov_corr[corr] 972s 0.02s call tests/window/test_pairwise.py::test_rolling_pairwise_cov_corr[cov] 972s 0.02s call tests/window/test_ewm.py::test_ewm_pairwise_cov_corr[corr] 972s 0.02s call tests/window/test_ewm.py::test_ewm_pairwise_cov_corr[cov] 972s 0.02s call tests/window/test_pairwise.py::TestPairwise::test_cov_mulittindex 972s 0.02s call tests/window/test_expanding.py::test_expanding_cov_pairwise_diff_length 972s 0.02s call tests/window/test_pairwise.py::test_flex_binary_frame[corr] 972s 0.02s call tests/window/test_apply.py::test_nans[False] 972s 0.02s call tests/window/test_pairwise.py::test_flex_binary_frame[cov] 972s 0.02s call tests/window/test_expanding.py::test_expanding_corr_pairwise_diff_length 972s 0.02s call tests/window/test_apply.py::test_min_periods[False-2-0] 972s 0.01s call tests/window/test_rolling.py::test_multi_index_names 972s 0.01s call tests/window/test_api.py::test_agg[None] 972s 0.01s call tests/window/test_api.py::test_agg[10] 972s 0.01s call 
tests/window/test_api.py::test_agg[5] 972s 0.01s call tests/window/test_api.py::test_agg[2] 972s 0.01s call tests/window/test_pairwise.py::TestPairwise::test_pairwise_with_other[pairwise_frames1-2] 972s 0.01s call tests/window/test_api.py::test_agg[1] 972s 0.01s call tests/window/test_pairwise.py::TestPairwise::test_pairwise_with_other[pairwise_frames1-1] 972s 0.01s call tests/window/test_pairwise.py::TestPairwise::test_pairwise_with_other[pairwise_frames4-3] 972s == 9014 passed, 650 skipped, 536 deselected, 43 xfailed, 2 warnings in 20.93s == 973s + echo 'rdjoqkol test state = false' 973s + for py in $pys 973s + echo '=== python3.12 ===' 973s rdjoqkol test state = false 973s === python3.12 === 973s ++ python3.12 -c 'import pandas as pd; print(pd.__path__[0])' 973s + modpath=/usr/lib/python3/dist-packages/pandas 973s + for TEST_SUBSET in $modpath/tests/* 973s + echo /usr/lib/python3/dist-packages/pandas/tests/__init__.py 973s + grep -q -e __pycache__ 973s + PANDAS_CI=1 973s + LC_ALL=C.UTF-8 973s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/__init__.py 974s ============================= test session starts ============================== 974s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 974s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 974s rootdir: /usr/lib/python3/dist-packages/pandas 974s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 974s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 974s asyncio: mode=Mode.STRICT 974s collected 0 items 974s 974s =============================== warnings summary =============================== 974s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 974s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-u_nnrzj6' 974s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 974s 974s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 974s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-rr0btdnd' 974s session.config.cache.set(STEPWISE_CACHE_DIR, []) 974s 974s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 974s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 974s ============================= 2 warnings in 0.09s ============================== 974s + test 5 == 5 974s + echo 'rdjoqkol test state = false' 974s + for TEST_SUBSET in $modpath/tests/* 974s + echo /usr/lib/python3/dist-packages/pandas/tests/__pycache__ 974s + grep -q -e __pycache__ 974s rdjoqkol test state = false 974s + echo 'rdjoqkol test state = false' 974s rdjoqkol test state = false 974s + for TEST_SUBSET in $modpath/tests/* 974s + echo /usr/lib/python3/dist-packages/pandas/tests/api 974s + grep -q -e __pycache__ 974s + PANDAS_CI=1 974s + LC_ALL=C.UTF-8 974s + xvfb-run 
--auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/api 975s ============================= test session starts ============================== 975s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 975s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 975s rootdir: /usr/lib/python3/dist-packages/pandas 975s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 975s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 975s asyncio: mode=Mode.STRICT 975s collected 14 items 975s 975s ../../../usr/lib/python3/dist-packages/pandas/tests/api/test_api.py ............ 975s ../../../usr/lib/python3/dist-packages/pandas/tests/api/test_types.py .. 975s 975s =============================== warnings summary =============================== 975s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 975s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-vcrvgt6s' 975s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 975s 975s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 975s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-dpmwu_ii' 975s session.config.cache.set(STEPWISE_CACHE_DIR, []) 975s 975s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 975s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 975s ============================= slowest 30 durations ============================= 975s 975s (30 durations < 0.005s hidden. Use -vv to show these durations.) 
975s ======================== 14 passed, 2 warnings in 0.12s ======================== 975s + echo 'rdjoqkol test state = false' 975s rdjoqkol test state = false 975s + for TEST_SUBSET in $modpath/tests/* 975s + echo /usr/lib/python3/dist-packages/pandas/tests/apply 975s + grep -q -e __pycache__ 975s + PANDAS_CI=1 975s + LC_ALL=C.UTF-8 975s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/apply 976s ============================= test session starts ============================== 976s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 976s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 976s rootdir: /usr/lib/python3/dist-packages/pandas 976s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 976s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 976s asyncio: mode=Mode.STRICT 976s collected 1243 items 976s 976s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_frame_apply.py .s....ssss........ss.s.s...............ss..ss.s..ss...................ssssssssssssssss.........................s..............................................s.s.s.s.............................................sss.s...........s.s..s...s....... 977s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_frame_apply_relabeling.py ..x.. 977s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_frame_transform.py ...s.s.s................................................ss..ss..ss.....x........x........x........ 977s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_invalid_arg.py ....................................................................................................................................................................................................... 978s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_numba.py sssssssssssssssssss 978s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_series_apply.py ................................x.....x....x........................................................................................ 978s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_series_apply_relabeling.py .. 978s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_series_transform.py ............ 980s ../../../usr/lib/python3/dist-packages/pandas/tests/apply/test_str.py ....................xxxxx...................................................................................................................................................................................................................................................................................................................................................................................................................................................................x...........x...........x...........x...........x........ 
980s 980s =============================== warnings summary =============================== 980s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 980s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-4zasud2k' 980s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 980s 980s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 980s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-x6oohmed' 980s session.config.cache.set(STEPWISE_CACHE_DIR, []) 980s 980s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 980s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 980s ============================= slowest 30 durations ============================= 980s 0.04s call tests/apply/test_frame_apply.py::test_apply_differently_indexed 980s 0.04s call tests/apply/test_series_apply.py::test_apply_listlike_transformer[compat-ops3-names3] 980s 0.03s call tests/apply/test_frame_apply.py::test_agg_transform[axis=1] 980s 0.03s call tests/apply/test_frame_apply.py::test_agg_transform[axis='columns'] 980s 0.02s call tests/apply/test_frame_apply.py::test_agg_reduce[axis=1] 980s 0.02s call tests/apply/test_frame_apply.py::test_agg_reduce[axis='columns'] 980s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops3-names3] 980s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops1-names1] 980s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops1-names1] 980s 0.02s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops3-names3] 980s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops0-names0] 980s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis=1-ops2-names2] 980s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops0-names0] 980s 0.01s call tests/apply/test_frame_transform.py::test_transform_listlike[axis='columns'-ops2-names2] 980s 0.01s call tests/apply/test_series_apply.py::test_transform[False] 980s 0.01s call tests/apply/test_series_apply.py::test_transform[compat] 980s 0.01s call tests/apply/test_frame_apply_relabeling.py::test_agg_namedtuple 980s 0.01s call tests/apply/test_str.py::test_transform_groupby_kernel_frame[axis=1-pct_change] 980s 0.01s call tests/apply/test_str.py::test_transform_groupby_kernel_frame[axis='columns'-pct_change] 980s 0.01s call tests/apply/test_frame_apply.py::test_agg_reduce[axis='index'] 980s 0.01s call tests/apply/test_frame_apply.py::test_agg_reduce[axis=0] 980s 0.01s call tests/apply/test_frame_apply.py::test_agg_transform[axis=0] 980s 0.01s call tests/apply/test_frame_apply.py::test_agg_transform[axis='index'] 980s 0.01s call tests/apply/test_frame_apply_relabeling.py::test_agg_relabel 980s 0.01s call tests/apply/test_frame_apply.py::test_apply_mutating 980s 0.01s call tests/apply/test_series_apply.py::test_with_nested_series[agg] 980s 0.01s call tests/apply/test_frame_apply_relabeling.py::test_agg_relabel_partial_functions 980s 0.01s call 
tests/apply/test_frame_apply_relabeling.py::test_agg_relabel_multi_columns_multi_methods 980s 0.01s call tests/apply/test_series_apply.py::test_with_nested_series[apply] 980s 0.01s teardown tests/apply/test_str.py::test_transform_method_name[rank] 980s =========== 1153 passed, 73 skipped, 17 xfailed, 2 warnings in 4.80s =========== 980s + echo 'rdjoqkol test state = false' 980s + for TEST_SUBSET in $modpath/tests/* 980s rdjoqkol test state = false 980s + echo /usr/lib/python3/dist-packages/pandas/tests/arithmetic 980s + grep -q -e __pycache__ 980s + PANDAS_CI=1 980s + LC_ALL=C.UTF-8 980s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/arithmetic 982s ============================= test session starts ============================== 982s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 982s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 982s rootdir: /usr/lib/python3/dist-packages/pandas 982s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 982s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 982s asyncio: mode=Mode.STRICT 982s collected 19330 items 982s 982s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_array_ops.py .. 982s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_categorical.py .. 1024s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_datetime64.py ....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
.........................................................................................s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
.......................................................................................................................................................................................................................................................................................................................................................................................................... 1025s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_interval.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1030s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_numeric.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss......ss...................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................s..s.....s..s............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1030s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_object.py .......s........................................................................................... 1031s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_period.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
1035s ../../../usr/lib/python3/dist-packages/pandas/tests/arithmetic/test_timedelta64.py .................................................................................................................................................................................................................................................................s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s.....s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s.....s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s.....s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s..s......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
1035s 1035s =============================== warnings summary =============================== 1035s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1035s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-swf2ytf5' 1035s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1035s 1035s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1035s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-7lenk2_0' 1035s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1035s 1035s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1035s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1035s ============================= slowest 30 durations ============================= 1035s 0.26s call tests/arithmetic/test_numeric.py::TestNumericArithmeticUnsorted::test_binops_index[python-idx20-idx12-sub] 1035s 0.18s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[array-US/Central-ms-0-True-BQuarterBegin] 1035s 0.12s call tests/arithmetic/test_datetime64.py::TestDatetime64Arithmetic::test_dt64arr_addsub_intlike[index-tzlocal()-None-QE] 1035s 0.09s call tests/arithmetic/test_datetime64.py::TestDatetime64ArrayLikeComparisons::test_dt64arr_cmp_arraylike_invalid[pytz.FixedOffset(300)-DataFrame-other0] 1035s 0.08s teardown tests/arithmetic/test_timedelta64.py::test_add_timestamp_to_timedelta 1035s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-s] 1035s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-ms] 1035s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-ns] 1035s 0.04s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_relativedelta_offsets[DataFrame-us] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-5-True-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-5-True-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-5-True-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-5-False-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-5-False-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-5-False-cls_and_kwargs27] 1035s 0.03s call 
tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ns-5-True-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ns-5-False-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-s-5-True-cls_and_kwargs27] 1035s 0.03s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-s-5-False-cls_and_kwargs27] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ms-5-True-cls_and_kwargs27] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ns-5-True-cls_and_kwargs27] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-us-5-False-cls_and_kwargs27] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ms-5-False-cls_and_kwargs27] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-us-5-True-cls_and_kwargs27] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-None-ns-5-False-cls_and_kwargs27] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-0-True-CBMonthBegin] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-ms-5-True-CBMonthBegin] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-0-True-CBMonthBegin] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-s-0-True-CBMonthBegin] 1035s 0.02s call tests/arithmetic/test_datetime64.py::TestDatetime64DateOffsetArithmetic::test_dt64arr_add_sub_DateOffsets[DataFrame-US/Central-us-5-True-CBMonthBegin] 1035s =============== 19158 passed, 172 skipped, 2 warnings in 53.87s ================ 1036s rdjoqkol test state = false 1036s + echo 'rdjoqkol test state = false' 1036s + for TEST_SUBSET in $modpath/tests/* 1036s + echo /usr/lib/python3/dist-packages/pandas/tests/arrays 1036s + grep -q -e __pycache__ 1036s + PANDAS_CI=1 1036s + LC_ALL=C.UTF-8 1036s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/arrays 1039s ============================= test session starts ============================== 1039s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1039s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1039s rootdir: /usr/lib/python3/dist-packages/pandas 1039s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1039s 
plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1039s asyncio: mode=Mode.STRICT 1039s collected 19230 items / 2 skipped 1039s 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_arithmetic.py ..................... 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_astype.py ... 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_comparison.py .................................... 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_construction.py ............................. 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_function.py ........... 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_indexing.py ... 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_logical.py ................................................................................... 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_ops.py .. 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_reduction.py .............................. 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/boolean/test_repr.py . 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_algos.py .............. 1039s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_analytics.py ........x..x................................................ 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_api.py ................................................................. 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_astype.py ...................... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_constructors.py ....................................................................................................................... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_dtypes.py .................................. 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_indexing.py ............................................................................................................................................... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_map.py ............................. 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_missing.py .......................... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_operators.py ...................................... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_replace.py ...................... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_repr.py ....................... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_sorting.py .... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_subclass.py ... 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_take.py ................ 
1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/categorical/test_warnings.py s 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/datetimes/test_constructors.py ......................ssssssssssssss 1040s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/datetimes/test_cumulative.py ... 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/datetimes/test_reductions.py ................................................................................................................................................................................................................. 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_arithmetic.py .............................................................. 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_astype.py ......... 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_comparison.py .................................................................................................................. 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_concat.py ... 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_construction.py ............................... 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_contains.py . 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_function.py .................................................... 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_repr.py ........ 1041s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/floating/test_to_numpy.py .............................. 1042s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_arithmetic.py ................................................................................................................................................................................................................................................................................................................................................... 1042s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_comparison.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_concat.py .................. 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_construction.py ............................................... 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_dtypes.py ......................................................................................................................... 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_function.py ............................................................................................................................. 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_indexing.py .. 
1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_reduction.py ............................................... 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/integer/test_repr.py .......................... 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_astype.py .. 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_formats.py . 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_interval.py .............................................................................. 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_interval_pyarrow.py ssssssss 1043s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/interval/test_overlaps.py .................................................................................................................... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/masked/test_arithmetic.py ..............................................................................................................................................ss........................................................................................................................................................ss........................................................................................................................................................ss........................................................................................................................................................ss........................................................................................................................................................ss............................................................................................................................................................................................................................................................................................................................................................... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/masked/test_function.py ..................... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/masked/test_indexing.py ........................................................................................................... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/numpy_/test_indexing.py ....................................... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/numpy_/test_numpy.py ....................................................................................... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/period/test_astype.py .......... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/period/test_constructors.py ..................... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/period/test_reductions.py ... 1045s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_accessor.py ............................................. 
1047s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_arithmetics.py ...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1047s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_array.py ........................................................................... 1047s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_astype.py ........................ 1047s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_combine_concat.py .......... 1047s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_constructors.py ................................. 1047s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_dtype.py ........................................................ 1048s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_indexing.py ................................................................................ 1048s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_libsparse.py ..................................................................................... 1048s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_reductions.py ....................................................................... 1048s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/sparse/test_unary.py ......... 1049s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/string_/test_string.py .ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxssxss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.......ssss.ss.ss.ss.ss.ss.ss.ssxxssssxxssss....ssssssss....ssssssss.sssssssssssssssssssssss.ss.ss..ssss.ss...ssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss 1049s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/string_/test_string_arrow.py s.ss.ssssssssss.ssssssssssssssssssssssssssssssssss 1049s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_array.py ....................................................................................... 
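The long runs of 's' markers in the string_ and interval_pyarrow listings above are skipped tests. The log does not print skip reasons at this verbosity, but a plausible explanation is that optional extras such as pyarrow are not installed in the testbed. The commands below are a minimal way to check that locally; the package name and the bare invocation (without the harness-specific options) are assumptions, not taken from the log:

    # Probe for the optional dependency whose absence typically causes these skips
    python3 -c "import pyarrow; print(pyarrow.__version__)" || echo "pyarrow not available"
    # Re-run just the string_ tests with a skip-reason summary (-rs)
    python3.12 -m pytest -rs -m 'not slow' /usr/lib/python3/dist-packages/pandas/tests/arrays/string_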
1065s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_datetimelike.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss
sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssssss....................................................................................................................................................................................................ssssssssssssssssssssssssssssssssssss.............................................................................................................................................................................................................................................................................................................................
............. 1069s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_datetimes.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1069s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_ndarray_backed.py ..... 1069s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_period.py ................... 1069s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/test_timedeltas.py ...................................................................................................................................................... 1069s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/timedeltas/test_constructors.py ........ 1069s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/timedeltas/test_cumulative.py ..... 1069s ../../../usr/lib/python3/dist-packages/pandas/tests/arrays/timedeltas/test_reductions.py .......................... 
1069s 1069s =============================== warnings summary =============================== 1069s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1069s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-zzdtxjoq' 1069s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1069s 1069s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1069s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-gjjl77ov' 1069s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1069s 1069s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1069s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1069s ============================= slowest 30 durations ============================= 1069s 0.23s call tests/arrays/test_datetimelike.py::TestDatetimeArray::test_bool_properties['UTC'-YE-is_year_end] 1069s 0.17s call tests/arrays/test_datetimelike.py::TestDatetimeArray::test_compare_categorical_dtype[False-zoneinfo.ZoneInfo(key='US/Pacific')-B-False-True] 1069s 0.14s call tests/arrays/sparse/test_dtype.py::test_equal[timedelta64[ns]-None] 1069s 0.11s call tests/arrays/floating/test_construction.py::test_to_array_bool[bool_values1-values1-Float64-expected_dtype1] 1069s 0.11s teardown tests/arrays/timedeltas/test_reductions.py::TestReductions::test_mean_2d 1069s 0.05s call tests/arrays/sparse/test_accessor.py::TestSeriesAccessor::test_from_coo 1069s 0.03s call tests/arrays/categorical/test_dtypes.py::TestCategoricalDtypes::test_codes_dtypes 1069s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_mixed_array_comparison[integer] 1069s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_mixed_array_comparison[block] 1069s 0.02s call tests/arrays/integer/test_arithmetic.py::test_values_multiplying_large_series_by_NA 1069s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_scalar_comparison[block] 1069s 0.02s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_array_comparison[integer] 1069s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_int_array_comparison[integer] 1069s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_scalar_comparison[integer] 1069s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_int_array_comparison[block] 1069s 0.01s call tests/arrays/sparse/test_arithmetics.py::TestSparseArrayArithmetics::test_float_array_comparison[block] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-None-csc] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-labels1-csr] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-labels1-coo] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-labels1-csc] 1069s 0.01s call 
tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-None-csr] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[float64-None-coo] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-labels1-coo] 1069s 0.01s call tests/arrays/test_period.py::test_repr_large 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-None-coo] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-None-csr] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-labels1-csr] 1069s 0.01s call tests/arrays/sparse/test_accessor.py::TestFrameAccessor::test_from_spmatrix[int64-labels1-csc] 1069s 0.01s call tests/arrays/test_datetimelike.py::TestPeriodArray::test_median[D] 1069s 0.01s call tests/arrays/test_datetimelike.py::TestPeriodArray::test_median[B] 1069s ========= 18203 passed, 1021 skipped, 8 xfailed, 2 warnings in 32.27s ========== 1071s + echo 'rdjoqkol test state = false' 1071s rdjoqkol test state = false 1071s + for TEST_SUBSET in $modpath/tests/* 1071s + echo /usr/lib/python3/dist-packages/pandas/tests/base 1071s + grep -q -e __pycache__ 1071s + PANDAS_CI=1 1071s + LC_ALL=C.UTF-8 1071s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/base 1072s ============================= test session starts ============================== 1072s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1072s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1072s rootdir: /usr/lib/python3/dist-packages/pandas 1072s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1072s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1072s asyncio: mode=Mode.STRICT 1072s collected 1775 items 1072s 1072s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_constructors.py ....................... 1072s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_conversion.py ................................................................................................................................................................................................................................................................................................................................... 1072s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_fillna.py ..................................................................................ssssssssssssssssss....ssss........ssssssssss......ss..............................................ss......................ssssssssssss 1073s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_misc.py .......................................................................................................................................................................................................................................xx...xxx....................................................................s...... 
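Each pytest invocation in this log is wrapped in xvfb-run: the session header reports the PyQt5/Qt plugin, and some pandas tests presumably expect an X display even on a headless builder. The wrapper can be exercised on its own; the sketch below reuses the exact flags from the trace, with the wrapped command reduced to a trivial check:

    # Start a throwaway X server on an unused display number and run a command under it
    xvfb-run --auto-servernum --server-args='-screen 0 1024x768x24' \
        python3.12 -c 'import os; print("DISPLAY =", os.environ.get("DISPLAY"))'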
1073s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_transpose.py .............................................................................................................................................................................................................................. 1073s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_unique.py ..................................................................................ssssssssssssssssss....ssss........ssssssssss......ss..............................................ss......................ssssssssssss..................................................................................ssssssssssssssssss....ssss..........ssssssss......ss......................................................................ssssssssssss.... 1074s ../../../usr/lib/python3/dist-packages/pandas/tests/base/test_value_counts.py ..................................................................................ssssssssssssssssss....ssss........ssssssssss......ss..............................................ss......................ssssssssssss......................... 1074s 1074s =============================== warnings summary =============================== 1074s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1074s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-1fymbbz4' 1074s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1074s 1074s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1074s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xza_7jg_' 1074s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1074s 1074s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1074s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1074s ============================= slowest 30 durations ============================= 1074s 0.06s call tests/base/test_value_counts.py::test_value_counts_null[interval-nan] 1074s 0.04s call tests/base/test_unique.py::test_unique[interval] 1074s 0.02s call tests/base/test_value_counts.py::test_value_counts_null[interval-None] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts[interval] 1074s 0.01s call tests/base/test_unique.py::test_unique[period] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[period-nan] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[period-None] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts[datetime-tz] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-tz-None] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-tz-nan] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[period-nan] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_bins[index] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[period-None] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_bins[series] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts[period] 1074s 0.01s call 
tests/base/test_value_counts.py::test_value_counts_null[timedelta-nan] 1074s 0.01s call tests/base/test_unique.py::test_unique[datetime-tz] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts[timedelta] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-tz-None] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-tz-nan] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[timedelta-None] 1074s 0.01s teardown tests/base/test_value_counts.py::test_value_counts_object_inference_deprecated 1074s 0.01s call tests/base/test_unique.py::test_unique[timedelta] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[timedelta-nan] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[timedelta-None] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-nan] 1074s 0.01s call tests/base/test_unique.py::test_unique[datetime] 1074s 0.01s call tests/base/test_unique.py::test_unique_null[datetime-None] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-nan] 1074s 0.01s call tests/base/test_value_counts.py::test_value_counts_null[datetime-None] 1074s =========== 1581 passed, 189 skipped, 5 xfailed, 2 warnings in 3.02s =========== 1074s rdjoqkol test state = false 1074s + echo 'rdjoqkol test state = false' 1074s + for TEST_SUBSET in $modpath/tests/* 1074s + echo /usr/lib/python3/dist-packages/pandas/tests/computation 1074s + grep -q -e __pycache__ 1074s + PANDAS_CI=1 1074s + LC_ALL=C.UTF-8 1074s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/computation 1076s ============================= test session starts ============================== 1076s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1076s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1076s rootdir: /usr/lib/python3/dist-packages/pandas 1076s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1076s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1076s asyncio: mode=Mode.STRICT 1076s collected 11159 items 1076s 1076s ../../../usr/lib/python3/dist-packages/pandas/tests/computation/test_compat.py ..... 
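The '+ for TEST_SUBSET ...' trace lines show the autopkgtest wrapper iterating over the installed test tree and running pytest once per subdirectory. A rough reconstruction of that loop is sketched below; $modpath, the environment variables, and the pytest options are taken from the trace, while SRCDIR is a placeholder for the /tmp/autopkgtest.*/build.*/src path, the __pycache__ skip and failure handling are inferred, and the Debian-patched --deb-data-root-dir option is omitted, so the real debian/tests script may differ:

    modpath=/usr/lib/python3/dist-packages/pandas
    for TEST_SUBSET in "$modpath"/tests/*; do
        # the trace echoes each candidate path into grep and skips bytecode directories
        echo "$TEST_SUBSET" | grep -q -e __pycache__ && continue
        PANDAS_CI=1 LC_ALL=C.UTF-8 xvfb-run --auto-servernum \
            --server-args='-screen 0 1024x768x24' \
            python3.12 -m pytest --tb=long -s -m 'not slow' \
            -c "$SRCDIR/pyproject.toml" --rootdir="$modpath" "$TEST_SUBSET"
    done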
1118s ../../../usr/lib/python3/dist-packages/pandas/tests/computation/test_eval.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
..............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xx..............................xx..............................xx..............................xx...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxxxxxxxx..................................................xxxxxxxxxx......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................X.........X........................................................................................................................................................................................................................................................................................................................................................................................
............................................................................................................................................................................................................................................................................................................................................x....................................................................................................................................................................................................................................xx..xx..... 1118s 1118s =============================== warnings summary =============================== 1118s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1118s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-298cmieo' 1118s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1118s 1118s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1118s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-c17oj35c' 1118s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1118s 1118s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1118s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1118s ============================= slowest 30 durations ============================= 1118s 0.08s call tests/computation/test_eval.py::TestEval::test_complex_cmp_ops[DataFrameNaN-DataFrameNaN-numexpr-python-&-lt-le] 1118s 0.06s call tests/computation/test_eval.py::TestEval::test_complex_cmp_ops[DataFrame-DataFrameNaN-python-python-and-lt-le] 1118s 0.06s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[numexpr-python] 1118s 0.05s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[numexpr-pandas] 1118s 0.05s teardown tests/computation/test_eval.py::TestValidate::test_validate_bool_args[5.0] 1118s 0.03s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[python-pandas] 1118s 0.03s call tests/computation/test_eval.py::TestAlignment::test_performance_warning_for_poor_alignment[python-python] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_simple_arith_ops[numexpr-python] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_lhs_expression_subscript 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[numexpr-python] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_nested_period_index_subscript_expression 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-pandas-|] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-pandas-|] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[numexpr-pandas] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-pandas-&] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-python-|] 
1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[python-python] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_check_many_exprs[python-pandas] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_simple_arith_ops[numexpr-pandas] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_attr_expression 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-pandas-&] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[numexpr-python-&] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_multi_line_expression 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-python-|] 1118s 0.01s call tests/computation/test_eval.py::TestOperations::test_fails_ampersand_pipe[python-python-&] 1118s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-python-dt-dt-s-i] 1118s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-pandas-dt-dt-i-i] 1118s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-python-dt-dt-i-i] 1118s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[python-pandas-dt-dt-s-i] 1118s 0.01s call tests/computation/test_eval.py::TestAlignment::test_medium_complex_frame_alignment[numexpr-pandas-dt-dt-s-i] 1118s ========== 11124 passed, 33 xfailed, 2 xpassed, 2 warnings in 43.04s =========== 1119s + echo 'rdjoqkol test state = false' 1119s + for TEST_SUBSET in $modpath/tests/* 1119s rdjoqkol test state = false 1119s + echo /usr/lib/python3/dist-packages/pandas/tests/config 1119s + grep -q -e __pycache__ 1119s + PANDAS_CI=1 1119s + LC_ALL=C.UTF-8 1119s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/config 1120s ============================= test session starts ============================== 1120s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1120s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1120s rootdir: /usr/lib/python3/dist-packages/pandas 1120s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1120s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1120s asyncio: mode=Mode.STRICT 1120s collected 50 items 1120s 1120s ../../../usr/lib/python3/dist-packages/pandas/tests/config/test_config.py ..................... 1120s ../../../usr/lib/python3/dist-packages/pandas/tests/config/test_localization.py ............................. 
1120s 1120s =============================== warnings summary =============================== 1120s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1120s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-nz_ltm7d' 1120s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1120s 1120s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1120s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ixmz6bp8' 1120s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1120s 1120s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1120s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1120s ============================= slowest 30 durations ============================= 1120s 0.01s call tests/config/test_localization.py::test_get_locales_prefix 1120s 1120s (29 durations < 0.005s hidden. Use -vv to show these durations.) 1120s ======================== 50 passed, 2 warnings in 0.48s ======================== 1120s + echo 'rdjoqkol test state = false' 1120s + for TEST_SUBSET in $modpath/tests/* 1120s rdjoqkol test state = false 1120s + echo /usr/lib/python3/dist-packages/pandas/tests/construction 1120s + grep -q -e __pycache__ 1120s + PANDAS_CI=1 1120s + LC_ALL=C.UTF-8 1120s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/construction 1121s ============================= test session starts ============================== 1121s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1121s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1121s rootdir: /usr/lib/python3/dist-packages/pandas 1121s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1121s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1121s asyncio: mode=Mode.STRICT 1121s collected 1 item 1121s 1121s ../../../usr/lib/python3/dist-packages/pandas/tests/construction/test_extract_array.py . 
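Every subset run ends with the same pair of PytestCacheWarning messages: pytest tries to create .pytest_cache under the rootdir, which here is the read-only /usr/lib/python3/dist-packages/pandas, so the writes fail with EACCES. The warnings are harmless for this CI run; if reproducing locally and the noise is unwanted, the cache plugin can be disabled or redirected using standard pytest options. The paths below are illustrative:

    # Option 1: disable the cache plugin for the run
    python3.12 -m pytest -p no:cacheprovider -m 'not slow' \
        /usr/lib/python3/dist-packages/pandas/tests/config
    # Option 2: keep the cache but point it at a writable directory
    python3.12 -m pytest -o cache_dir=/tmp/pandas-pytest-cache -m 'not slow' \
        /usr/lib/python3/dist-packages/pandas/tests/config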
1121s 1121s =============================== warnings summary =============================== 1121s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1121s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-02kg91qv' 1121s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1121s 1121s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1121s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-lo62wvor' 1121s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1121s 1121s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1121s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1121s ============================= slowest 30 durations ============================= 1121s 1121s (3 durations < 0.005s hidden. Use -vv to show these durations.) 1121s ======================== 1 passed, 2 warnings in 0.10s ========================= 1121s + echo 'rdjoqkol test state = false' 1121s rdjoqkol test state = false 1121s + for TEST_SUBSET in $modpath/tests/* 1121s + echo /usr/lib/python3/dist-packages/pandas/tests/copy_view 1121s + grep -q -e __pycache__ 1121s + PANDAS_CI=1 1121s + LC_ALL=C.UTF-8 1121s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/copy_view 1123s ============================= test session starts ============================== 1123s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1123s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1123s rootdir: /usr/lib/python3/dist-packages/pandas 1123s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1123s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1123s asyncio: mode=Mode.STRICT 1123s collected 793 items 1123s 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_datetimeindex.py ...... 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_index.py ..................... 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_periodindex.py .. 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/index/test_timedeltaindex.py .. 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_array.py ............. 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_astype.py .....ss...s..........s.. 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_chained_assignment_deprecation.py ............... 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_clip.py ...... 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_constructors.py ............................................................................ 
1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_core_functionalities.py ....... 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_functions.py .................... 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_indexing.py ....................................................................................s.....s........................................................................................................................................ 1123s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_internals.py ..................... 1124s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_interp_fillna.py ...................................................... 1124s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_methods.py ........................................................................................................................................................................................................................................................ 1124s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_replace.py ........................................ 1124s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_setitem.py ......... 1124s ../../../usr/lib/python3/dist-packages/pandas/tests/copy_view/test_util.py .. 1124s 1124s =============================== warnings summary =============================== 1124s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1124s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-riuc1qm4' 1124s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1124s 1124s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1124s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xmk1etex' 1124s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1124s 1124s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1124s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1124s ============================= slowest 30 durations ============================= 1124s 0.01s call tests/copy_view/test_internals.py::test_exponential_backoff 1124s 1124s (29 durations < 0.005s hidden. Use -vv to show these durations.) 
1124s ================== 787 passed, 6 skipped, 2 warnings in 2.37s ================== 1124s + echo 'rdjoqkol test state = false' 1124s + for TEST_SUBSET in $modpath/tests/* 1124s + echo /usr/lib/python3/dist-packages/pandas/tests/dtypes 1124s + grep -q -e __pycache__ 1124s rdjoqkol test state = false 1124s + PANDAS_CI=1 1124s + LC_ALL=C.UTF-8 1124s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/dtypes 1126s ============================= test session starts ============================== 1126s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1126s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1126s rootdir: /usr/lib/python3/dist-packages/pandas 1126s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1126s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1126s asyncio: mode=Mode.STRICT 1126s collected 5628 items 1126s 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_can_hold_element.py ........... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_construct_from_scalar.py .... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_construct_ndarray.py ....... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_construct_object_arr.py ....................................... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_dict_compat.py . 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_downcast.py ................................... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_find_common_type.py .......................................................................................... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_infer_datetimelike.py ... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_infer_dtype.py .................................................................... 1126s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_maybe_box_native.py ................ 
1129s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/cast/test_promote.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 
1130s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_common.py .............................................................................................................................................................................s...................................................................................................................................................................................................................................................................... 1130s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_concat.py .... 1130s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_dtypes.py ........................................................................................................................................................................................................................................................................................... 1130s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_generic.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1131s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_inference.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssssssss....................................................... 1132s ../../../usr/lib/python3/dist-packages/pandas/tests/dtypes/test_missing.py ..........................................................................................................xxxx............................................................................................................................................................................................................. 
1132s 1132s =============================== warnings summary =============================== 1132s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1132s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-n586e4u0' 1132s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1132s 1132s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1132s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-_5_7x8g0' 1132s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1132s 1132s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1132s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1132s ============================= slowest 30 durations ============================= 1132s 0.07s setup tests/dtypes/test_common.py::test_dtype_equal[category-category-category-category] 1132s 0.07s call tests/dtypes/test_inference.py::TestInference::test_maybe_convert_objects_mixed_datetimes 1132s 0.05s setup tests/dtypes/cast/test_promote.py::test_maybe_promote_any_with_datetime64[uint16-pd.Timestamp] 1132s 0.05s call tests/dtypes/test_common.py::test_is_sparse[True] 1132s 0.02s teardown tests/dtypes/test_missing.py::TestIsValidNAForDtype::test_is_valid_na_for_dtype_categorical 1132s 1132s (25 durations < 0.005s hidden. Use -vv to show these durations.) 1132s ============ 5615 passed, 9 skipped, 4 xfailed, 2 warnings in 6.88s ============ 1132s + echo 'rdjoqkol test state = false' 1132s + for TEST_SUBSET in $modpath/tests/* 1132s rdjoqkol test state = false 1132s + echo /usr/lib/python3/dist-packages/pandas/tests/extension 1132s + grep -q -e __pycache__ 1132s + PANDAS_CI=1 1132s + LC_ALL=C.UTF-8 1132s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/extension 1135s ============================= test session starts ============================== 1135s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1135s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1135s rootdir: /usr/lib/python3/dist-packages/pandas 1135s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1135s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1135s asyncio: mode=Mode.STRICT 1135s collected 16808 items / 1 skipped 1135s 1135s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/array_with_attr/test_array_with_attr.py . 
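The two PytestCacheWarning entries repeated in each warnings summary above come from pytest trying to create .pytest_cache under /usr/lib/python3/dist-packages/pandas, which the unprivileged test user cannot write to; they do not affect the test results. If a wrapper like the one traced above wanted to silence them, standard pytest options would do it (sketch only; the /tmp path is illustrative):

    python3 -m pytest -p no:cacheprovider ...                      # disable the cache plugin entirely
    python3 -m pytest -o cache_dir=/tmp/pandas-pytest-cache ...    # or point the cache at a writable directory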
1138s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/decimal/test_decimal.py ssssssssssssssssssssssssssssssssssss....................................x.................................................................................................................................................................................................................................................................................................................................xx............................................................s............................xxxxxxxxss..............xxssxxss................................................xxx..................... 1149s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/json/test_json.py ssssssssssssssssssssssssssssssssssss.................................................................................ssssssssssssssssssssssss.............................................s.......................................................s............................................ss................................................................................xx.......................................................s............xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx........xxxx........xxxxxxs.xxxxxxxxxxxxxxxxxxxxx....xxxxxxx.xx.xxxxxx...xxxxx...x...xxxxxxxxxxxx. 1149s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/list/test_list.py . 1151s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_categorical.py ssssssssssssssssssssssssssssssssssss....................................x..................................................................................................ssssssssssssssssssssssss...................................s.........................................................................................................................ss.................................................................................................xx...........................................................s............x..sxx................x.............xxxxxssssssssssssssssssssssssssssssssssss.. 1151s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_common.py ............ 1153s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_datetime.py ........................ssss............................................x..............................................................................................ssssssssssssssssssssssss.................................................................s......................................................................................................................................................................................xx..............................................xx...........................................................s............................................ssss......... 1153s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_extension.py .............. 
1154s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_interval.py ssssssssssssssssssssssssssssssssssss....................................x..................................................................................................ssssssssssssssssssssssss.................................................................s...............................................................................................................................ss......................................................xx..............................................xx............................................................s............x. 1170s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_masked.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss..................................................................................................................................................................................................................................................................................................................................................................................................x...x...x...x...x...x...x...x...x...x...x..............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssssssssssssssssssss..ssssssssssssssssssss..........................................................................................................................................................................................................................................................................ssssssssssssssssssssss................................................................................................................................................................................ssssssssssssssssssssss......................ssssssssssssssssssssss................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxxxxxxxxxxxxxxxxxxxx..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s.s.s.s.s.s.s.s.s.s.s..................................................................................s............................................................................................x...........sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 1175s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_numpy.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxx...............................................................................................................s.s........................ss.x.x...x.x.x.x.x.xxx.x.x.............x.x.....................................................x.x................ssssssssssssssssssssssssssssssssssssssssssssssss.x.x.x.x.x..xx.x..xx..xx...xxx...xxx.x...x...x.x.x..............xx..xxssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.. 1177s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_period.py ..........................................ssss........ssss.....................................................................................x...x..................................................................................................................................................................................................ssssssssssssssssssssssssssssssssssssssssssssssss..................................................................................................................................ss........................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxx.......................................................................................................................s.s............................................................................ssss........ssss................ 
1186s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_sparse.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss..ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....................................ssss................................................................................................................................................................................................................................................................................................................................................................................................xxxx........................ssss..............................................................xxxx..........................................................................................................................................xxxxxxxxxxxxxx...x.x.x.xxxxxxxxxxxxxxxss........xxxxxxxxssxxssss.x.x.x.xxxxxxxxxssxxss..........s.s.s.s.s.s...s......xxx......xx........ss..ssssss................xsxsssssssssssssss..............ssssssssssssss..............xxxx....xx.x............................x..x.x......xx.xxx......xxxxxx. 1191s ../../../usr/lib/python3/dist-packages/pandas/tests/extension/test_string.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss...ssssss...ssssss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss...ssssss...ssssss.x..ssssssss.x..ssssssss....ssssssss....ssssssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss....ssssssss....ssssssss........ssssssssssssssss........ssssssssssssssss.ss.ss.ss.ss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssss.ss.ss.......................................ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.......................................ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss........ssssssssssssssss........ssssssssssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss....ssssssss....ssssssss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss..ssss..ssss........ssssssssssssssss........ssssssssssssssss.ss.ss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss...ssssss...ssssss....ssssssss....ssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.s
s..ssss..ssss.ss.ss......ssssssssssss......ssssssssssss.ss.ssssssssssssss.....ssssssssss.....ssssssssss.....ssssssssss.....ssssssssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss....ssssssss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss...ssssss...ssssss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss...ssssss...ssssss..ssss..ssssxxssssxxssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss..ssss..ssss..ssss..ssss.ss.ss.ss.ss.sssss.sssss.ss.ss.ss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss..ssss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ss.ss.ss.ss 1191s 1191s =============================== warnings summary =============================== 1191s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1191s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-_tzzwfgo' 1191s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1191s 1191s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1191s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-y_5z7gn6' 1191s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1191s 1191s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1191s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1191s ============================= slowest 30 durations ============================= 1191s 0.29s setup tests/extension/test_string.py::TestStringArray::test_getitem_slice[False-python] 1191s 0.22s call tests/extension/test_period.py::TestPeriodArray::test_shift_fill_value[D] 1191s 0.19s setup tests/extension/test_masked.py::Test2DCompat::test_getitem_2d[Int64Dtype] 1191s 0.13s call tests/extension/test_masked.py::TestMaskedArrays::test_concat[Float64Dtype-True] 1191s 0.13s call tests/extension/test_categorical.py::TestCategorical::test_unstack[series-index3] 1191s 0.09s call tests/extension/decimal/test_decimal.py::TestDecimalArray::test_compare_array[ge] 1191s 0.07s teardown tests/extension/test_string.py::test_searchsorted_with_na_raises[False-False-pyarrow_numpy] 1191s 0.06s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[0-series-index2] 1191s 0.05s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[nan-series-index2] 1191s 0.05s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[0-series-index3] 1191s 0.05s call 
tests/extension/test_sparse.py::TestSparseArray::test_unstack[nan-series-index3] 1191s 0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[series-index2] 1191s 0.04s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[nan-frame-index2] 1191s 0.04s call tests/extension/test_string.py::TestStringArray::test_unstack[True-python-series-index2] 1191s 0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[frame-index2] 1191s 0.04s call tests/extension/test_sparse.py::TestSparseArray::test_unstack[0-frame-index2] 1191s 0.04s call tests/extension/decimal/test_decimal.py::TestDecimalArray::test_arith_series_with_array[__rpow__] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Int16Dtype-series-index2] 1191s 0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[series-index3] 1191s 0.04s call tests/extension/test_numpy.py::TestNumpyExtensionArray::test_unstack[float-series-index2] 1191s 0.04s call tests/extension/test_string.py::TestStringArray::test_unstack[False-python-series-index2] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Float32Dtype-series-index2] 1191s 0.04s call tests/extension/decimal/test_decimal.py::TestDecimalArray::test_unstack[series-index2] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Int8Dtype-series-index2] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[BooleanDtype-series-index2] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Int32Dtype-series-index2] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[Int64Dtype-series-index2] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[UInt32Dtype-series-index2] 1191s 0.04s call tests/extension/test_interval.py::TestIntervalArray::test_unstack[frame-index3] 1191s 0.04s call tests/extension/test_masked.py::TestMaskedArrays::test_unstack[UInt16Dtype-series-index2] 1191s ======== 12106 passed, 4371 skipped, 332 xfailed, 2 warnings in 58.37s ========= 1192s rdjoqkol test state = false 1192s + echo 'rdjoqkol test state = false' 1192s + for TEST_SUBSET in $modpath/tests/* 1192s + echo /usr/lib/python3/dist-packages/pandas/tests/frame 1192s + grep -q -e __pycache__ 1192s + PANDAS_CI=1 1192s + LC_ALL=C.UTF-8 1192s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/frame 1196s ============================= test session starts ============================== 1196s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1196s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1196s rootdir: /usr/lib/python3/dist-packages/pandas 1196s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1196s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1196s asyncio: mode=Mode.STRICT 1196s collected 11172 items / 433 deselected / 1 skipped / 10739 selected 1196s 1197s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/constructors/test_from_dict.py .............. 
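For reading the progress lines and per-subset summaries above: '.' marks a passed test, 's' a skipped test, 'x' an expected failure (xfail) and 'X' an unexpected pass (xpass), which is why a summary such as "12106 passed, 4371 skipped, 332 xfailed" goes with the long runs of 's' and 'x' markers in the extension subset. To have pytest print the reason behind each of those outcomes one could add the report flags (generic pytest usage, not something this harness does):

    python3 -m pytest -rsxX ...   # report reasons for (s)kipped, (x)failed and (X)passed tests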
1197s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/constructors/test_from_records.py ........................... 1197s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_coercion.py .......x.x. 1197s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_delitem.py .... 1197s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_get.py .... 1197s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_get_value.py .. 1197s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_getitem.py ........................................ 1198s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_indexing.py ...................................................................................................................................................................................................................................sss................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 1198s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_insert.py ....... 1198s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_mask.py ........... 1198s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_set_value.py ... 1199s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_setitem.py ........................................................................................................s..........................................................................................xxx...........................x..x..x..x........ 1199s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_take.py .... 1201s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_where.py ............................................................................................................................................. 1201s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/indexing/test_xs.py .............................. 1201s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_add_prefix_suffix.py ... 1202s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_align.py ........................................................................................................................................................................................................................................................................................................................................................................................................................ 1202s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_asfreq.py ........................................ 1202s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_asof.py ........... 1202s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_assign.py ..... 
1203s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_astype.py .......................................................................................................................s....s........................................................................................................ss.....ssssssssss 1203s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_at_time.py ...................... 1203s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_between_time.py ss............................ 1203s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_clip.py ..................... 1203s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_combine.py ..... 1204s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_combine_first.py ..................................s............................... 1204s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_compare.py ......................... 1204s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_convert_dtypes.py ..ssss..ssss.sss.. 1204s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_copy.py ..... 1204s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_count.py .. 1204s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_cov_corr.py .............................................................s............ 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_describe.py ...............................................s 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_diff.py .............................................. 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_dot.py ................sss 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_drop.py ......................................................................... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_drop_duplicates.py ..................................... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_droplevel.py .. 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_dropna.py ................... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_dtypes.py ........ 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_duplicated.py .......xxx........... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_equals.py ... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_explode.py ..................... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_fillna.py ................................................................. 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_filter.py ........... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_first_and_last.py ............. 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_first_valid_index.py ............... 1205s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_get_numeric_data.py .... 1207s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_head_tail.py .................................................................. 
1207s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_infer_objects.py . 1207s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_info.py ..........................x......s...... 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_interpolate.py ...................................................................ssssssssss 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_is_homogeneous_dtype.py ....... 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_isetitem.py ... 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_isin.py ................. 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_iterrows.py . 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_join.py ...........s................... 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_map.py ......................... 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_matmul.py .. 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_nlargest.py .........................................................................................................................................................................................................................................................X.....X.....X.....X.......X..... 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_pct_change.py ............................. 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_pipe.py ...... 1208s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_pop.py ... 1210s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_quantile.py ........................................................................xx..........xx..........xx..........xx.................. 1212s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_rank.py .........................................................................................................................ss 1212s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reindex.py ................................................................................................................................................. 1212s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reindex_like.py ..... 1212s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_rename.py ......................... 1212s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_rename_axis.py ......... 1212s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reorder_levels.py ... 1213s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_replace.py ...................ss.................ssss....ssss................................................................................................................................................................. 1213s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_reset_index.py ................................................................................................................................ 1213s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_round.py ......... 
1213s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_sample.py .......................................................... 1214s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_select_dtypes.py ..................................s...... 1214s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_set_axis.py .............. 1216s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_set_index.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 1218s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_shift.py ...................................................................................x.x.x.x.x.xxxx........ 1218s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_size.py ..... 1218s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_sort_index.py ................................................................. 1219s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_sort_values.py ...................................................X...............................XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX..... 1219s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_swapaxes.py .... 1219s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_swaplevel.py . 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_csv.py ............................................................................. 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_dict.py ...................................................................................................... 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_dict_of_blocks.py ... 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_numpy.py .... 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_period.py ...................................................................... 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_records.py ................................... 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_to_timestamp.py ...................................................................... 1220s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_transpose.py ................... 1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_truncate.py ........................................................................................ 1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_tz_convert.py ........... 1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_tz_localize.py ......... 1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_update.py .............. 
1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_value_counts.py ................................. 1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/methods/test_values.py ............... 1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_alter_axes.py .. 1221s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_api.py ......................ss.......... 1223s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_arithmetic.py ..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x..........x................................................................................................................................ 1223s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_block_internals.py ................... 1226s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_constructors.py ..................................................................................................................................................................................................s.....................................................................................................................................................................................................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..............................ssssss................ssss..........sss...ss.......................................................s..........................................xxxx..xx..........xxxx..xx................................ 1226s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_cumulative.py ....... 1226s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_iteration.py .................... 1226s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_logical_ops.py ................. 1226s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_nonunique_indexes.py ................ 1226s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_npfuncs.py .... 1228s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_query_eval.py ..................ss..ss..ss.......................s.......................s..s......sssss.................................................s........................s..s.....sssss...............................................ss..ss......................ss.............................................................................s....ss. 
1231s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_reductions.py ....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s....................................................................................................x.............x............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 1231s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_repr.py ..........................................ssss................ 1237s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_stack_unstack.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1237s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_subclass.py .................................................... 1238s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_ufunc.py ....xx.........xxxxxxxx.xx....s. 1238s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_unary.py .................. 1238s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_validate.py ............................ 
1238s 1238s =============================== warnings summary =============================== 1238s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1238s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-smb8s0av' 1238s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1238s 1238s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1238s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-mjw6vg8g' 1238s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1238s 1238s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1238s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1238s ============================= slowest 30 durations ============================= 1238s 1.53s call tests/frame/indexing/test_where.py::test_where_inplace_casting 1238s 1.13s call tests/frame/methods/test_rank.py::TestRank::test_pct_max_many_rows 1238s 0.45s call tests/frame/test_api.py::TestDataFrameMisc::test_inspect_getmembers 1238s 0.36s call tests/frame/methods/test_info.py::test_info_verbose_with_counts_spacing[10001- # Column Non-Null Count Dtype ---- ------ -------------- ----- - 0 0 3 non-null float64- 10000 10000 3 non-null float64] 1238s 0.26s call tests/frame/methods/test_cov_corr.py::TestDataFrameCorr::test_corr_scipy_method[kendall] 1238s 0.24s call tests/frame/test_stack_unstack.py::TestDataFrameReshape::test_stack_partial_multiIndex[True-level4-multiindex_columns15] 1238s 0.21s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_with_dst_transitions_with_pickle 1238s 0.20s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_chunking[10000] 1238s 0.20s call tests/frame/test_constructors.py::TestDataFrameConstructors::test_dict_nocopy[Int64-timedelta64[ns]-True] 1238s 0.19s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_chunking[50000] 1238s 0.19s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_chunking[100000] 1238s 0.16s call tests/frame/methods/test_cov_corr.py::TestDataFrameCorrWith::test_corrwith[Float64] 1238s 0.15s call tests/frame/methods/test_info.py::test_info_verbose_with_counts_spacing[1001- # Column Non-Null Count Dtype ---- ------ -------------- ----- - 0 0 3 non-null float64- 1000 1000 3 non-null float64] 1238s 0.14s call tests/frame/methods/test_rename.py::TestRename::test_rename_with_duplicate_columns 1238s 0.13s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack[False] 1238s 0.11s call tests/frame/methods/test_at_time.py::TestAtTime::test_at_time_axis[0] 1238s 0.11s call tests/frame/methods/test_assign.py::TestAssign::test_assign_dependent 1238s 0.11s call tests/frame/methods/test_at_time.py::TestAtTime::test_at_time_axis[index] 1238s 0.11s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_order_with_unsorted_levels_multi_row[False] 1238s 0.10s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack[True] 1238s 0.09s call 
tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_unstack_multiple[False] 1238s 0.09s call tests/frame/test_repr.py::TestDataFrameRepr::test_repr_to_string 1238s 0.08s call tests/frame/methods/test_to_csv.py::TestDataFrameToCSV::test_to_csv_dups_cols 1238s 0.08s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_unstack_multiple[True] 1238s 0.08s call tests/frame/test_repr.py::TestDataFrameRepr::test_repr_bytes_61_lines 1238s 0.08s call tests/frame/test_stack_unstack.py::TestStackUnstackMultiLevel::test_stack_order_with_unsorted_levels_multi_row[True] 1238s 0.07s call tests/frame/methods/test_interpolate.py::TestDataFrameInterpolate::test_interp_string_axis[index-0] 1238s 0.06s call tests/frame/test_block_internals.py::TestDataFrameBlockInternals::test_strange_column_corruption_issue 1238s 0.06s call tests/frame/methods/test_interpolate.py::TestDataFrameInterpolate::test_interp_string_axis[columns-1] 1238s 0.05s call tests/frame/test_stack_unstack.py::TestDataFrameReshape::test_stack_int_level_names[False] 1238s = 10435 passed, 209 skipped, 433 deselected, 58 xfailed, 38 xpassed, 2 warnings in 45.05s = 1239s rdjoqkol test state = false 1239s + echo 'rdjoqkol test state = false' 1239s + for TEST_SUBSET in $modpath/tests/* 1239s + echo /usr/lib/python3/dist-packages/pandas/tests/generic 1239s + grep -q -e __pycache__ 1239s + PANDAS_CI=1 1239s + LC_ALL=C.UTF-8 1239s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/generic 1240s ============================= test session starts ============================== 1240s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1240s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1240s rootdir: /usr/lib/python3/dist-packages/pandas 1240s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1240s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1240s asyncio: mode=Mode.STRICT 1240s collected 1249 items 1240s 1240s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_duplicate_labels.py ..........xx...........x.......xx.xxx................x................ 1244s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_finalize.py ..........................x..................................x........x....................................................................................................................xs..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x.s..s...s..s..x..x.x...x.s..s...s..s.x..x..x...x........................................................................................xxxxxxxxx..........xxxxxxxxxxxx. 
1244s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_frame.py ............... 1244s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_generic.py ................................................................................. 1244s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_label_or_level_utils.py ....................................................................... 1244s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_series.py ................... 1244s ../../../usr/lib/python3/dist-packages/pandas/tests/generic/test_to_xarray.py ......................s....................................... 1244s 1244s =============================== warnings summary =============================== 1244s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1244s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-yq4hxb68' 1244s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1244s 1244s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1244s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-_79r4_ka' 1244s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1244s 1244s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1244s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1244s ============================= slowest 30 durations ============================= 1244s 0.13s call tests/generic/test_to_xarray.py::TestSeriesToXArray::test_to_xarray_index_types[string] 1244s 0.03s call tests/generic/test_generic.py::TestGeneric::test_truncate_out_of_bounds[DataFrame] 1244s 0.01s call tests/generic/test_generic.py::TestGeneric::test_truncate_out_of_bounds[Series] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[MultiIndex] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[IntervalIndex] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[CategoricalIndex] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[TimedeltaIndex] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[Index2] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[PeriodIndex] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[DatetimeIndex] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[Index1] 1244s 0.01s call tests/generic/test_duplicate_labels.py::test_raises_basic[Index0] 1244s 0.01s call tests/generic/test_to_xarray.py::TestDataFrameToXArray::test_to_xarray_with_multiindex 1244s 0.01s call tests/generic/test_to_xarray.py::TestDataFrameToXArray::test_to_xarray_index_types[string] 1244s 0.01s call tests/generic/test_finalize.py::test_groupby_finalize_not_implemented[5-obj1] 1244s 0.01s teardown tests/generic/test_to_xarray.py::TestSeriesToXArray::test_to_xarray_with_multiindex 1244s 1244s (14 durations < 0.005s hidden. Use -vv to show these durations.) 
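The PytestCacheWarning entries above (repeated in each subset run) occur because the rootdir is the installed, read-only package directory /usr/lib/python3/dist-packages/pandas, so pytest cannot create a .pytest_cache directory next to the tests; the warnings are harmless for this run. For a local reproduction one could silence them by disabling the cache plugin or pointing the cache at a writable path. The following is only a sketch and omits the Debian-specific options the harness passes (for example --deb-data-root-dir and the copied pyproject.toml):

    import sys
    import pytest

    args = [
        "-m", "not slow",
        "-p", "no:cacheprovider",   # drop the cache plugin entirely...
        # "-o", "cache_dir=/tmp/pandas-pytest-cache",  # ...or redirect it
        "--rootdir=/usr/lib/python3/dist-packages/pandas",
        "/usr/lib/python3/dist-packages/pandas/tests/generic",
    ]
    sys.exit(pytest.main(args))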
1244s ========== 1006 passed, 105 skipped, 138 xfailed, 2 warnings in 4.68s ========== 1245s rdjoqkol test state = false 1245s + echo 'rdjoqkol test state = false' 1245s + for TEST_SUBSET in $modpath/tests/* 1245s + echo /usr/lib/python3/dist-packages/pandas/tests/groupby 1245s + grep -q -e __pycache__ 1245s + PANDAS_CI=1 1245s + LC_ALL=C.UTF-8 1245s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/groupby 1248s ============================= test session starts ============================== 1248s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1248s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1248s rootdir: /usr/lib/python3/dist-packages/pandas 1248s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1248s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1248s asyncio: mode=Mode.STRICT 1248s collected 29434 items / 1832 deselected / 1 skipped / 27602 selected 1248s 1249s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_aggregate.py ...............................................................................................................................................................................................................................................................................................................................................x..x.......................................................................x....... 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_cython.py ........................................................................................................................................................................ 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_other.py ........................................ 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_corrwith.py . 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_describe.py ......................... 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_groupby_shift_diff.py ............................................... 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_is_monotonic.py ...... 1250s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_nlargest_nsmallest.py ........................................... 1251s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_nth.py .................................................................................................................................................................................................................................... 
1252s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_quantile.py ..................................................................................................x....x.......................................................................................................................................................................................................................... 1255s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_rank.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1255s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_sample.py .............. 1257s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_size.py .......x....x....x....x....x....x....x....x...............ss 1257s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_skew.py . 1258s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/methods/test_value_counts.py ...........X......XXX...XXX.........XXX...XXX........................................ss.ss.ss.ss.ss.ss....XX..........................XXXXXXXXXXXXXXXX........XXXXXXXX...................................... 1258s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_all_methods.py ......................................xx..............................................................................................ss..........ss..............ss...... 1258s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_api.py ......s..s..............................s..s.......................... 1259s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_apply.py .................................................................................................................................... 1259s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_apply_mutate.py ..... 1259s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_bin_groupby.py ...... 
1270s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_categorical.py .................................................................................................................................x......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ssxxxxxx.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s........s........s........s........s........s........s........s...ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss.ss.sssss...................ss.ss.ss.ss.ss.ss.................sxsx................................................................................sxsx................................................................................sxsx................................................................................sxsx.......................................x................................ 1270s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_counting.py .................................ssss 1270s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_cumulative.py ..................................................... 1270s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_filters.py ............................ 
1277s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_groupby.py ......................................s..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s...................s................................................................................................................... 
1294s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_groupby_dropna.py ..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................xxxxxxxx.................................................. 
1295s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_groupby_subclass.py .....s................................................................. 1295s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_grouping.py ....................................................................................... 1295s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_index_as_string.py .................. 1295s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_indexing.py ............................................................................................................................................................................ 1295s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_libgroupby.py ........................... 1295s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_missing.py ......................... 1296s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_numeric_only.py ..................xxx......ssssss..............................xxx.................................................................................................................................................................................................................................................................................................................................................................................. 1296s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_pipe.py .. 1318s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_raises.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1319s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_reductions.py ...............................................................................................................................................................................................................................................................................................................ss........................................................................................................................................................ 1320s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py ..............................s 1320s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 1329s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_transform.py ....................x.........................................................................................................................x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x....................................................................................x....................x............................................................................................................................................................................................................ 1329s 1329s =============================== warnings summary =============================== 1329s tests/groupby/test_categorical.py::test_basic 1329s /usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:86: FutureWarning: The behavior of DataFrame.sum with axis=None is deprecated, in a future version this will reduce over both axes and return a scalar. 
To retain the old behavior, pass axis=0 (or do not pass axis) 1329s return reduction(axis=axis, out=out, **passkwargs) 1329s 1329s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1329s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-20xoa1dk' 1329s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1329s 1329s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1329s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-f6nv2159' 1329s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1329s 1329s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1329s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1329s ============================= slowest 30 durations ============================= 1329s 0.36s call tests/groupby/test_raises.py::test_groupby_raises_category[by3-False-any-agg] 1329s 0.27s call tests/groupby/test_reductions.py::test_ops_general[sem-scipy_sem] 1329s 0.23s call tests/groupby/test_groupby_dropna.py::test_groupby_dropna_with_multiindex_input[True-keys1-None] 1329s 0.12s call tests/groupby/methods/test_quantile.py::test_quantile[0-a_vals1-b_vals1-nearest] 1329s 0.12s teardown tests/groupby/transform/test_transform.py::test_idxmin_idxmax_transform_args[False-False-idxmin] 1329s 0.07s call tests/groupby/test_categorical.py::test_basic 1329s 0.06s call tests/groupby/test_counting.py::test_count 1329s 0.06s call tests/groupby/test_counting.py::TestCounting::test_ngroup_cumcount_pair 1329s 0.05s call tests/groupby/test_timegrouper.py::TestGroupBy::test_timegrouper_with_reg_groups 1329s 0.05s call tests/groupby/test_groupby_subclass.py::test_groupby_preserves_subclass[corrwith-obj0] 1329s 0.04s call tests/groupby/test_categorical.py::test_datetime 1329s 0.04s call tests/groupby/methods/test_describe.py::test_frame_describe_multikey 1329s 0.04s call tests/groupby/test_apply.py::test_apply_concat_preserve_names 1329s 0.03s call tests/groupby/test_groupby.py::test_groupby_multiindex_not_lexsorted 1329s 0.03s call tests/groupby/transform/test_transform.py::test_as_index_no_change[corrwith-keys1] 1329s 0.03s call tests/groupby/test_apply.py::test_apply_corner_cases 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-True-multi] 1329s 0.02s call tests/groupby/test_categorical.py::test_observed[False] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-False-multi] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-True-False-multi] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-True-True-multi] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-True-True-multi] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-False-True-multi] 1329s 0.02s call tests/groupby/test_categorical.py::test_describe_categorical_columns 1329s 0.02s call tests/groupby/test_groupby.py::test_groupby_as_index_agg 1329s 0.02s call 
tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-False-False-multi] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-True-True-False-multi] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-True-single] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-False-range] 1329s 0.02s call tests/groupby/test_groupby_dropna.py::test_categorical_reducers[corrwith-False-False-False-single] 1329s = 26564 passed, 910 skipped, 1832 deselected, 90 xfailed, 39 xpassed, 3 warnings in 83.55s (0:01:23) = 1331s rdjoqkol test state = false 1331s + echo 'rdjoqkol test state = false' 1331s + for TEST_SUBSET in $modpath/tests/* 1331s + echo /usr/lib/python3/dist-packages/pandas/tests/indexes 1331s + grep -q -e __pycache__ 1331s + PANDAS_CI=1 1331s + LC_ALL=C.UTF-8 1331s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/indexes 1336s ============================= test session starts ============================== 1336s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1336s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1336s rootdir: /usr/lib/python3/dist-packages/pandas 1336s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1336s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1336s asyncio: mode=Mode.STRICT 1336s collected 16998 items / 4 deselected / 16994 selected 1336s 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_constructors.py .......s.. 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_formats.py .............. 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_indexing.py ............ 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_pickle.py . 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_reshape.py ..............s.... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_setops.py ............................................................ 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/base_class/test_where.py . 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_append.py ....... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_astype.py ........... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_category.py ......................................... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_constructors.py ..... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_equals.py ......ss 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_fillna.py ... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_formats.py .. 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_indexing.py ................................. 
1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_map.py ..................... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_reindex.py ....... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/categorical/test_setops.py .. 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_drop_duplicates.py ................................................................................................................ 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_equals.py ..................... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_indexing.py ................ 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_is_monotonic.py . 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_nat.py ...................... 1336s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_sort_values.py ................................................................................... 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimelike_/test_value_counts.py ............................................ 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_asof.py .. 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_astype.py ................................. 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_delete.py ....................... 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_factorize.py .................................................................................... 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_fillna.py .. 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_insert.py ............................................................................................................................................................................................. 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_isocalendar.py .. 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_map.py ..... 1337s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_normalize.py ......... 1338s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_repeat.py .................................................................................................................................................................................................................................................................................................................................................... 1338s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_resolution.py .................................................................................................................................................................................... 
1338s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_round.py ...................................................................................................................................................................................................................... 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_shift.py ........................................................................................................................................ 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_snap.py ........................ 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_frame.py .. 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_julian_date.py ..... 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_period.py ........................................... 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_pydatetime.py .. 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_to_series.py . 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_tz_convert.py .................................... 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_tz_localize.py ............................................................................................................................................................ 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/methods/test_unique.py ........................ 1339s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_arithmetic.py .....................x 1340s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_constructors.py ..................................................................................................................................................................................................................x...x.................................... 1341s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_date_range.py .................................................................................................................................................................................................................................................................................................................................................. 1341s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_datetime.py .................. 1341s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_formats.py ........................................ 1341s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_freq_attr.py .......................... 1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_indexing.py .......................................................................................................................................................................................................................................................................................................................................................................................... 
1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_iter.py ............ 1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_join.py ...................... 1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_npfuncs.py . 1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_ops.py ................ 1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_partial_slicing.py .................................. 1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_pickle.py ...... 1342s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_reindex.py .. 1347s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_scalar_compat.py ..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1347s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_setops.py ................................................................................................................................ 1347s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/datetimes/test_timezones.py ........................................ 1348s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_astype.py ....................................x........................................................................................................................... 1348s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_constructors.py ............................................................................................................................................................................................................................................................................ssssssss.......................................s.................s.....s.....s.....s....................................ssssssss.......................................s.................s.....s.....s.....s.................................s 1348s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_equals.py .... 1348s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_formats.py ........... 1349s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_indexing.py ............................................................................................................................................................................................................................................................................................ 1349s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_interval.py .......x....x....x....x.................................................................................................................................................................................................................................. 
1350s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_interval_range.py ........................................................................................................................................................ 1350s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_interval_tree.py .................................................................................................................................................................................................................... 1350s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_join.py ... 1350s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_pickle.py ... 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/interval/test_setops.py ................................................................................. 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_analytics.py ...................................... 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_astype.py ... 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_compat.py ...... 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_constructors.py ..................................................................................s................. 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_conversion.py ...... 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_copy.py .......... 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_drop.py ............. 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_duplicates.py .................................................. 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_equivalence.py .............. 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_formats.py .............. 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_get_level_values.py ....... 1351s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_get_set.py ................... 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_indexing.py ........................................................................................................................................... 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_integrity.py ................ 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_isin.py .............. 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_join.py ....................................................... 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_lexsort.py .. 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_missing.py ...x.. 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_monotonic.py ........... 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_names.py ............................... 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_partial_indexing.py ..... 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_pickle.py . 
1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_reindex.py ............ 1352s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_reshape.py ........... 1353s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_setops.py ............................................................................................................................................................................................................sss................................................................... 1353s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_sorting.py .......................... 1353s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/multi/test_take.py ... 1353s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_astype.py ................... 1353s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_indexing.py ..............................................................................................................................................ss.......................................... 1353s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_join.py ........... 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_numeric.py ................................................................................................ 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/numeric/test_setops.py .................... 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/object/test_astype.py .. 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/object/test_indexing.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.ss. 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_asfreq.py ............... 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_astype.py ............. 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_factorize.py .. 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_fillna.py . 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_insert.py ... 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_is_full.py . 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_repeat.py ...... 1354s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_shift.py ...... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/methods/test_to_timestamp.py ........ 
1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_constructors.py .................................................................................................. 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_formats.py ................... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_freq_attr.py . 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_indexing.py ......................................................................... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_join.py ........... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_monotonic.py .. 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_partial_slicing.py .............. 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_period.py .................................................................................................................................... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_period_range.py ........................... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_pickle.py .... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_resolution.py ......... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_scalar_compat.py ... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_searchsorted.py ........ 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_setops.py .............. 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/period/test_tools.py ............ 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_constructors.py ............................. 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_indexing.py ............... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_join.py .......... 1355s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_range.py ................................................................................. 1356s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/ranges/test_setops.py ................................................................... 1357s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_any_index.py ......................................................................................................................s......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
1358s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_base.py .........................................................................................................................................................................x.............................................................................ssss....ss..........ss......ss.................................................................................................................................ssss...........................................................................................................................................................................................................................................s.......................................................................................................s..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s 1359s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_common.py ...........................................................................................................................................................................................................xxxxxxxxxxxxxxxxxxxxxxxxxxxxx.........................................................................................................................sssssssss...s....ss..........................xs.....................sss................................................sss....................................................................................s................s...............................................................................................................................................................................................................................................................................XX........................................... 1359s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_datetimelike.py ........................................ 1360s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_engines.py ......................................... 1360s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_frozen.py .......... 1360s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_index_new.py ............................................xxxxssss................................................................................................................ 
1360s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_indexing.py ......................................................ss...............................s.................................................................................................................................................................................................................................................................................................s........................ 1361s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_numpy_compat.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss..................... 1363s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_old_base.py s...s...................sss.............................ssssssssss.s..........ss.................s.............s.....s..............s..sss..........................................................s.......................................................................ssssssss..s..sssssssss..s..sssssssss..s..sssssssss..s..sssssssss..s..s......................s..............................................s................s..............................s........................ssssssss....s.s...s.....s........sssssssss...s....s...sss...................................................................................................................ss......................ssssss.........................................................................................................................................................................s......................................................................................................................................................................................s...s...........s...s...........................................................................................s...s... 
1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_setops.py ...........................................................................................................................................x........................................................................................................................................................................................................................................................................................................................................................................................................................X..................................................................x....................................................................................................X.........X...............................................................................................................X..........................................................................................................................................................................................................................................................................................................................................................s...........................................................................................................................ss..s.s...s...s.....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ssss....ss..........ss......ss..................................................................................................................................................................................................................................................................ssss....ss..........ss......ss................................................................................................................................................................................................................................................................s...........................................................................................................................................................................................ss 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/test_subclass.py . 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_astype.py ............... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_factorize.py .. 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_fillna.py . 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_insert.py ............... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_repeat.py . 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/methods/test_shift.py ...... 
1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_arithmetic.py ... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_constructors.py ..................... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_delete.py ... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_formats.py ..... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_freq_attr.py ........... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_indexing.py .................................... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_join.py ....... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_ops.py .......... 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_pickle.py . 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_scalar_compat.py ........ 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_searchsorted.py ........ 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_setops.py ................................ 1367s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_timedelta.py ... 1368s ../../../usr/lib/python3/dist-packages/pandas/tests/indexes/timedeltas/test_timedelta_range.py ............................ 1368s 1368s =============================== warnings summary =============================== 1368s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1368s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-68mi882f' 1368s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1368s 1368s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1368s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-kl3plyj6' 1368s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1368s 1368s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1368s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1368s ============================= slowest 30 durations ============================= 1368s 0.34s call tests/indexes/ranges/test_setops.py::test_range_difference 1368s 0.24s call tests/indexes/test_numpy_compat.py::test_numpy_ufuncs_basic[empty-tanh] 1368s 0.11s call tests/indexes/datetimes/methods/test_round.py::TestDatetimeIndexRound::test_round3['UTC+01:15'] 1368s 0.10s call tests/indexes/multi/test_indexing.py::test_pyint_engine 1368s 0.09s teardown tests/indexes/timedeltas/test_timedelta_range.py::TestTimedeltas::test_timedelta_range_deprecated_freq[2.5T-5 hours-5 hours 8 minutes-expected_values1-150s] 1368s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[s] 1368s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[ms] 1368s 0.09s call 
tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[s] 1368s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[us] 1368s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_begin[ns] 1368s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[ms] 1368s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[ns] 1368s 0.09s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonTickFreq::test_date_range_custom_business_month_end[us] 1368s 0.07s call tests/indexes/datetimes/methods/test_tz_localize.py::TestTZLocalize::test_dti_tz_localize_roundtrip[tzlocal()] 1368s 0.07s call tests/indexes/period/test_indexing.py::TestGetItem::test_getitem_seconds 1368s 0.06s call tests/indexes/period/test_partial_slicing.py::TestPeriodIndex::test_range_slice_seconds[period_range] 1368s 0.05s call tests/indexes/datetimes/methods/test_tz_localize.py::TestTZLocalize::test_dti_tz_localize_roundtrip[zoneinfo.ZoneInfo(key='US/Pacific')] 1368s 0.04s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[right-1] 1368s 0.04s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[left-1] 1368s 0.04s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[both-1] 1368s 0.04s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[neither-1] 1368s 0.03s call tests/indexes/multi/test_sorting.py::test_remove_unused_levels_large[datetime64[D]-str] 1368s 0.03s call tests/indexes/datetimes/test_date_range.py::TestDateRangeNonNano::test_date_range_freq_matches_reso 1368s 0.03s call tests/indexes/datetimes/test_timezones.py::TestDatetimeIndexTimezones::test_with_tz[tz1] 1368s 0.03s call tests/indexes/multi/test_integrity.py::test_consistency 1368s 0.03s call tests/indexes/datetimes/test_timezones.py::TestDatetimeIndexTimezones::test_with_tz[tz0] 1368s 0.03s call tests/indexes/datetimes/test_constructors.py::TestDatetimeIndex::test_constructor_datetime64_tzformat[W-SUN] 1368s 0.02s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[right-10] 1368s 0.02s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[left-10] 1368s 0.02s call tests/indexes/interval/test_interval_tree.py::TestIntervalTree::test_get_indexer_closed[both-10] 1368s = 16688 passed, 254 skipped, 4 deselected, 46 xfailed, 6 xpassed, 2 warnings in 35.91s = 1369s + echo 'rdjoqkol test state = false' 1369s + for TEST_SUBSET in $modpath/tests/* 1369s rdjoqkol test state = false 1369s + echo /usr/lib/python3/dist-packages/pandas/tests/indexing 1369s + grep -q -e __pycache__ 1369s + PANDAS_CI=1 1369s + LC_ALL=C.UTF-8 1369s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/indexing 1370s ============================= test session starts ============================== 1370s platform linux -- Python 3.12.7, pytest-8.3.3, 
pluggy-1.5.0 1370s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1370s rootdir: /usr/lib/python3/dist-packages/pandas 1370s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1370s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1370s asyncio: mode=Mode.STRICT 1370s collected 4389 items 1370s 1371s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/interval/test_interval.py .............................. 1371s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/interval/test_interval_new.py ..................... 1371s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_chaining_and_caching.py .. 1371s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_datetime.py .. 1371s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_getitem.py ............................................................................. 1371s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_iloc.py ................ 1372s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_indexing_slow.py .......... 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_loc.py ................................................................................................................................. 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_multiindex.py ................ 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_partial.py ............. 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_setitem.py ........................... 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_slice.py ............................. 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/multiindex/test_sorted.py ......... 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_at.py ......................................... 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_categorical.py .................s................................................................................................ 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_chaining_and_caching.py .............................. 1373s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_check_indexer.py ....................s.... 1374s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_coercion.py ..........xxxxxxx...............................................................xx............................................xxxx....x............................................................xxxxx..................xx............................................................................................................................................................................................................x 1374s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_datetime.py .........ss 1375s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_floats.py ............................................................................................................................................... 1375s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_iat.py ..... 
1375s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_iloc.py .................................................................................................................................................................................................................. 1375s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_indexers.py ...... 1377s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_indexing.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1380s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_loc.py .............................................................................................................................x...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s...................................................................................................................................................................................................................................................................................................................................................................s.................................... 
1380s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_na_indexing.py .............................................................................................................................................................................................................................................................................. 1380s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_partial.py .................................... 1380s ../../../usr/lib/python3/dist-packages/pandas/tests/indexing/test_scalar.py ...................................... 1380s 1380s =============================== warnings summary =============================== 1380s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1380s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-23igfq6r' 1380s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1380s 1380s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1380s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-qp1jnpmj' 1380s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1380s 1380s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1380s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1380s ============================= slowest 30 durations ============================= 1380s 0.30s call tests/indexing/test_loc.py::TestLocBaseIndependent::test_loc_non_unique_memory_error[900000-100000] 1380s 0.15s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-3] 1380s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-4] 1380s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-2] 1380s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-1] 1380s 0.10s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[a-0] 1380s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-4] 1380s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-3] 1380s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-2] 1380s 0.09s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-1] 1380s 0.08s call tests/indexing/test_na_indexing.py::test_series_mask_boolean[True-Series-mask0-values2-float64] 1380s 0.08s call tests/indexing/multiindex/test_indexing_slow.py::test_multiindex_get_loc[b-0] 1380s 0.06s call tests/indexing/test_indexing.py::TestDatetimelikeCoercion::test_setitem_dt64_string_scalar[zoneinfo.ZoneInfo(key='UTC')-loc] 1380s 0.06s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_implicit_take2 1380s 0.05s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_setting_entire_column 1380s 0.05s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_str 1380s 0.05s call 
tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_implicit_take 1380s 0.05s call tests/indexing/test_loc.py::TestLocBaseIndependent::test_loc_getitem_range_from_spmatrix[int64-coo_matrix] 1380s 0.04s call tests/indexing/multiindex/test_setitem.py::TestMultiIndexSetItem::test_groupby_example 1380s 0.03s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_per_axis_per_level_setitem 1380s 0.02s call tests/indexing/test_loc.py::TestLocSeries::test_loc_nonunique_masked_index 1380s 0.02s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_per_axis_per_level_getitem 1380s 0.02s teardown tests/indexing/test_scalar.py::TestMultiIndexScalar::test_multiindex_at_get_one_level 1380s 0.01s call tests/indexing/multiindex/test_setitem.py::TestMultiIndexSetItem::test_setitem_multiindex3 1380s 0.01s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_per_axis_per_level_doc_examples 1380s 0.01s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_loc_axis_arguments 1380s 0.01s call tests/indexing/test_loc.py::TestLocCallable::test_frame_loc_setitem_callable 1380s 0.01s call tests/indexing/multiindex/test_slice.py::TestMultiIndexSlicers::test_multiindex_slicers_non_unique 1380s 0.01s call tests/indexing/test_categorical.py::TestCategoricalIndex::test_ix_categorical_index_non_unique[False] 1380s 0.01s call tests/indexing/test_chaining_and_caching.py::TestChaining::test_detect_chained_assignment_false_positives 1380s =========== 4360 passed, 6 skipped, 23 xfailed, 2 warnings in 10.94s =========== 1381s rdjoqkol test state = false 1381s + echo 'rdjoqkol test state = false' 1381s + for TEST_SUBSET in $modpath/tests/* 1381s + echo /usr/lib/python3/dist-packages/pandas/tests/interchange 1381s + grep -q -e __pycache__ 1381s + PANDAS_CI=1 1381s + LC_ALL=C.UTF-8 1381s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/interchange 1381s ============================= test session starts ============================== 1381s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1381s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1381s rootdir: /usr/lib/python3/dist-packages/pandas 1381s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1381s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1381s asyncio: mode=Mode.STRICT 1381s collected 140 items 1381s 1382s ../../../usr/lib/python3/dist-packages/pandas/tests/interchange/test_impl.py ..sssssssss..............sssss........s..s..ssssssssssssssssssssssssssssssss. 1382s ../../../usr/lib/python3/dist-packages/pandas/tests/interchange/test_spec_conformance.py ................ 
1382s ../../../usr/lib/python3/dist-packages/pandas/tests/interchange/test_utils.py ................sssssssssssssssssssssssssssssss 1382s 1382s =============================== warnings summary =============================== 1382s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1382s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-gyl4lrav' 1382s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1382s 1382s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1382s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-3sd1xfxm' 1382s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1382s 1382s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1382s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1382s ============================= slowest 30 durations ============================= 1382s 0.02s setup tests/interchange/test_impl.py::test_empty_categorical_pyarrow 1382s 0.02s call tests/interchange/test_impl.py::test_dataframe[4] 1382s 0.01s call tests/interchange/test_impl.py::test_dataframe[3] 1382s 0.01s call tests/interchange/test_impl.py::test_dataframe[0] 1382s 0.01s call tests/interchange/test_impl.py::test_dataframe[1] 1382s 0.01s call tests/interchange/test_impl.py::test_dataframe[2] 1382s 1382s (24 durations < 0.005s hidden. Use -vv to show these durations.) 1382s ================== 61 passed, 79 skipped, 2 warnings in 0.42s ================== 1382s rdjoqkol test state = false 1382s + echo 'rdjoqkol test state = false' 1382s + for TEST_SUBSET in $modpath/tests/* 1382s + echo /usr/lib/python3/dist-packages/pandas/tests/internals 1382s + grep -q -e __pycache__ 1382s + PANDAS_CI=1 1382s + LC_ALL=C.UTF-8 1382s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/internals 1383s ============================= test session starts ============================== 1383s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1383s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1383s rootdir: /usr/lib/python3/dist-packages/pandas 1383s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1383s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1383s asyncio: mode=Mode.STRICT 1383s collected 257 items 1383s 1383s ../../../usr/lib/python3/dist-packages/pandas/tests/internals/test_api.py ......... 1383s ../../../usr/lib/python3/dist-packages/pandas/tests/internals/test_internals.py .................................................................................................................................................................................................................................................... 
1384s ../../../usr/lib/python3/dist-packages/pandas/tests/internals/test_managers.py .... 1384s 1384s =============================== warnings summary =============================== 1384s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1384s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-f5n99xjf' 1384s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1384s 1384s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1384s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xo54kano' 1384s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1384s 1384s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1384s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1384s ============================= slowest 30 durations ============================= 1384s 0.33s call tests/internals/test_managers.py::test_array_manager_depr_env_var[block] 1384s 0.31s call tests/internals/test_managers.py::test_array_manager_depr_env_var[array] 1384s 0.01s call tests/internals/test_internals.py::TestBlockManager::test_equals_block_order_different_dtypes[c:sparse;d:sparse_na;b:f8] 1384s 0.01s call tests/internals/test_internals.py::TestCanHoldElement::test_interval_can_hold_element[4-uint64] 1384s 0.01s call tests/internals/test_internals.py::TestBlockManager::test_equals_block_order_different_dtypes[a:i8;e:dt;f:td;g:string] 1384s 0.01s call tests/internals/test_internals.py::TestCanHoldElement::test_interval_can_hold_element[4-float64] 1384s 1384s (24 durations < 0.005s hidden. Use -vv to show these durations.) 1384s ======================= 257 passed, 2 warnings in 1.45s ======================== 1384s + echo 'rdjoqkol test state = false' 1384s rdjoqkol test state = false 1384s + for TEST_SUBSET in $modpath/tests/* 1384s + echo /usr/lib/python3/dist-packages/pandas/tests/io 1384s + grep -q -e __pycache__ 1384s + PANDAS_CI=1 1384s + LC_ALL=C.UTF-8 1384s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/io 1390s ============================= test session starts ============================== 1390s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1390s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1390s rootdir: /usr/lib/python3/dist-packages/pandas 1390s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1390s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1390s asyncio: mode=Mode.STRICT 1390s collected 15511 items / 201 deselected / 2 skipped / 15310 selected 1390s 1390s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_odf.py ..... 1390s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_odswriter.py .......... 
1391s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_openpyxl.py ...................................................... 1403s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_readers.py ......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss............ss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..sssssssssss..s..s.sssssssssssssssssss..s..s.sssssssssssssssssssssssssssssss............ss..ssssssssss......s.sssssssssssssssssss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss.xx.xxs.sssss......s.sssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..ssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......sssssss......sssssss......s.sssss......s.sssss......s.sssss......s.sssss..................sss...sssssssssssssss......s.sssss......s.sssss....................................ssssss......ssssssssssssssssssssssssssssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss......s.sssss............ss..ssssssssss......s.sssss......s.sssss......sssssss...ssss.sssss 1404s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_style.py ...................................................................................................................................s 1414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_writers.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_xlrd.py ....... 1414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/excel/test_xlsxwriter.py ..... 1414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_bar.py ..........................................................................................................................s 1414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_exceptions.py ... 1414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_format.py ......................................................................................................... 
1415s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_highlight.py ................................................................................................ 1415s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_html.py ........................................................................................... 1415s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_matplotlib.py ........................................................... 1415s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_non_unique.py ......... 1416s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_style.py ................................................................................................................................................................................................. 1416s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_to_latex.py ............................................................................................................................................................ 1416s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_to_string.py ..... 1416s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/style/test_tooltip.py .... 1416s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_console.py ........... 1417s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_css.py ............................................................................................... 1417s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_eng_formatting.py ....... 1418s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_format.py .................................................................................................................................................................................. 1418s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_ipython_compat.py ss..s 1418s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_printing.py ......... 1418s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_csv.py ..........s.................................................................................. 1418s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_excel.py .......................................................................................................................................................................................................................................................................................................................................................................................................................... 1419s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_html.py .....................................................................................................................................................................................................................................................................................................................................................................s................... 1419s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_latex.py ............................................................................................ 
1419s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_markdown.py .......... 1420s ../../../usr/lib/python3/dist-packages/pandas/tests/io/formats/test_to_string.py ...............................................s........................................... 1420s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_compression.py ........sssssss.................s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s....... 1420s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_deprecated_kwargs.py . 1421s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_json_table_schema.py ...........................................................................................x...x...x...x...x...x...x...x...x...x......x...x...x...x...x...x...x...x...x...x........................... 1421s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_json_table_schema_ext_dtype.py ..................... 1421s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_normalize.py ..................................................... 1424s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_pandas.py ...........................................................................................................xxxx................................xxx........................................................................................................................................................................s........xxxxxxxxxxxxxxxxxx................................................................xx.............s.....x........ssssssssssssssssssssssssssssssss...sssss.s. 1424s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_readlines.py ..s.s.....ss.s.s.s.s......ssss....s..ss..ss.s.s........... 1425s ../../../usr/lib/python3/dist-packages/pandas/tests/io/json/test_ujson.py ................................................................................................................................................................................................................... 
1425s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_chunksize.py ......ss.........sss......ss...s...s......ss...s...s...s......ss...s...s...s 1426s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_common_basic.py ....s...s...s...ss..s...s......ss.........sss...s...s...s...s...s.........sss............ssss......ss...s...s......ss.........sss...s......ss...s.........sss...s...s......ss...s...s...............sssss...s...s...s......ss...s......ss......ss...s...s...s...s...s...s 1426s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_data_list.py ...s...s...s...s 1426s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_decimal.py ......ss...s 1427s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_file_buffer_url.py ...s...s...s...s...s.................................sssssssssss...s...s...s...s...s...s...s...s............ssss.....................sssssss...s...s...s...s 1427s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_float.py ...s.......s.....................sss....x.....x...ss 1428s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_index.py ......ss......ss...s............ssss...s...s...s...s...s...s...s 1428s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_inf.py ......ss......ss...s 1428s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_ints.py ...s............ssss...s......ss...s.........sss.........sss......ss......ss...s 1428s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_iterator.py ...s...s...s.........sss...s 1428s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_read_errors.py ...s...s...s.........sss...s..................ssssss...s...s...s...s...s..xs...s...s...s...s 1428s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/common/test_verbose.py ...s...s 1428s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/dtypes/test_categorical.py .........sss......ss...s...s...s...s...s...s........................ssssssss...s...s...s...s...s............ssss...s 1430s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/dtypes/test_dtypes_basic.py ............ssss...s...s...s...s................................................ssssssssssssssss...s...s.............................x.x.x...................................................................................................................................................................x.x.x.x.x.x.x.x.x.x.x.x.........................................................................................................................................................................................................................................................................................................................x.x.x.x.x.x.x.x.x.x.x.x...................................................................................................................................................................................................................s......ss...s...s........................ssssssss......ss...s...s...s...sssssssss...sssss...sssssssssssssssssssssssssssss...s...s 1431s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/dtypes/test_empty.py ...s...s...s...s...s...s...s...s........................ssssssss 1431s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_c_parser_only.py 
.......................................................................... 1431s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_comment.py ......ss.........sss...s...s...s..................ssssss......ssxx.s 1433s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_compression.py .........sss......ss...s...s.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss.x.x.xss......ss..................ssssss......................................................ssssssssssssssssss.........sss...s...s...s 1433s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_concatenate_chunks.py ss 1433s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_converters.py ...s............ssss...s...s...s......ss...s...s 1433s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_dialect.py ...s...s...s......................................................ssssssssssssssssss..................ssssss 1434s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_encoding.py ...s...s..................ssssss...s...s...............sssss......................................................ssssssssssssssssss.........sss............................................................................................................ssssssssssssssssssssssssssssssssssss...s....................sssss...s...s......ss 1434s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_header.py ...s...s......ss......ss...s...s...s.........sss.........sss.........sss.........sss...s...s...s...s......ss......ss......ss......ss...s.........sss........................ssssssss...s...s...s...s...s...s...s...s...ss 1434s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_index_col.py ......ss...s...s...s..............................ssssssssss...s...............sssss...s...s...s...s...s...s...s...s......ss...s 1434s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_mangle_dupes.py ...s...s...s.........sss.........sss...s...s...s......ss 1435s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_na_values.py ...s...s..........................................ssssssssssssss...s......ss...s...s.........sss............ssss...s...s...s......ss......ss...s......ss...s...s.........sss...s......ss...s......ss..................ssssss...s...s...s...s...s 1443s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_network.py ................ssss.....ssssssssssssssssss 1445s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_parse_dates.py ...s...s...s......ss..............ss...s...s...s...s......ss...s...s...s...s...sxxxxxxss......ss......ss......ss...s.........sss...s...s...s......ss.........sss............ssss......ss...s......ss......ss............ssss...s......ss...s......ss......ss...s...s...s............ssss...s..................ssssss.........sss......ss............................sssssssss...s...s......ss...s...s......s......ss...s...s...s......ss...s...s...s 1445s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_python_parser_only.py ..................................................................................... 
1445s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_quoting.py .........sss......ss...s..................ssssss............ssss...............sssss......ss......ss......ss 1445s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_read_fwf.py ..........................................................................sss.. 1446s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_skiprows.py ......ss...s...s.........sss...s.........sss........xsss...s......ss...s...s...s...s...s 1446s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_textreader.py ..................................... 1446s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_unsupported.py ..........s...s..xs...s 1446s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/test_upcast.py ...........................ssss 1446s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/usecols/test_parse_dates.py ......ss...s...s...s............ssss 1446s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/usecols/test_strings.py ...s...s......ss......ss 1446s ../../../usr/lib/python3/dist-packages/pandas/tests/io/parser/usecols/test_usecols_basic.py ...s......ss...s......ss...s...s...s......ss............ssss...s...s...s...s...s...s......ss...s...s......ss......ss......ss.....................sssssss......ss......ss...s...s...s 1447s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py ....x................ 1448s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_categorical.py ..... 1448s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_compat.py .... 1448s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_complex.py ......... 1448s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_errors.py ................ 1451s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py ...................................................xxxxxxxxx................................................................................................................................... 1451s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_keys.py .... 1451s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_put.py ...................... 1451s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_pytables_missing.py s 1452s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_read.py ....................s 1452s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_retain_attributes.py ..... 1452s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_round_trip.py ..............................s 1455s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_select.py ...x.....x............... 1457s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py ............................................x..................... 1457s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_subclass.py .. 1457s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_time_series.py .... 1457s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_timezones.py .......................................................................... 1459s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_byteswap.py .......... 
1459s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_sas.py ... 1459s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_sas7bdat.py ....................... 1459s ../../../usr/lib/python3/dist-packages/pandas/tests/io/sas/test_xport.py ....... 1459s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_clipboard.py .......QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-ubuntu' 1460s ...............................................................................................................................................................................................................................................................................................................................ssssss.. 1461s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_common.py .......................................................s....s........ss.......s.........s.......s......................................... 1462s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_compression.py ........................................................................................................................................ 1462s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_fsspec.py .........ssssssssss..........s 1462s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_gbq.py .. 1462s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_gcs.py ssssssssssssssssss. 1470s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_html.py .........ssssss...............................................................................................................................s.................... 1477s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_http_headers.py ...ss.......ss.......ss....ss 1477s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_parquet.py .ssss.ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss.ssssssss 1483s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_pickle.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 
1483s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_s3.py sss 1483s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_spss.py .......s.F 1495s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py ssss....ssssss....ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss...ssss....ssssss...ssss...ssss...ssss...xssssssssssssssxxxxssssss...ssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss...xssssss...xssssss...xssssss....ssssss....ssssss....ssssss...xssssss...xssssss...xssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss...xssssss...xssssss...xss.ssss...xssssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss....ssssss....ssssss...ssss...ssss...xss.ss...ssss..sssss..sssss..s.ssssss....ssssss...ssss...ssss...ssssxxxssss...ssssxxxssssssssssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss....ssssss....ssssss....ssssss..xssss..sssss..sssss..sssss...ssss...ssss...ssss...ssss..sssss...xssssss....ssssss...ssss....ss..ssss....ssssss....ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssxxxxssssssxxxxssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ssssss....ss....s..ss................ 1500s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_stata.py ..............................................................................................................................................................................................................................................................................................................................................................................s.................................................................................................................................................................................................................................................... 1500s ../../../usr/lib/python3/dist-packages/pandas/tests/io/xml/test_to_xml.py ........................................................................s...............................................................ss 1503s ../../../usr/lib/python3/dist-packages/pandas/tests/io/xml/test_xml.py ........................................................................................s............................................................................s......s......s......s......s......s......s......s......s......s......s......s..s.sss.sss. 1503s ../../../usr/lib/python3/dist-packages/pandas/tests/io/xml/test_xml_dtypes.py .............................................. 
1503s
1503s =================================== FAILURES ===================================
1503s ______________________________ test_spss_metadata ______________________________
1503s
1503s datapath = <function datapath.<locals>.deco at 0x762c231ea340>
1503s
1503s @pytest.mark.filterwarnings("ignore::pandas.errors.ChainedAssignmentError")
1503s @pytest.mark.filterwarnings("ignore:ChainedAssignmentError:FutureWarning")
1503s def test_spss_metadata(datapath):
1503s # GH 54264
1503s fname = datapath("io", "data", "spss", "labelled-num.sav")
1503s
1503s df = pd.read_spss(fname)
1503s metadata = {
1503s "column_names": ["VAR00002"],
1503s "column_labels": [None],
1503s "column_names_to_labels": {"VAR00002": None},
1503s "file_encoding": "UTF-8",
1503s "number_columns": 1,
1503s "number_rows": 1,
1503s "variable_value_labels": {"VAR00002": {1.0: "This is one"}},
1503s "value_labels": {"labels0": {1.0: "This is one"}},
1503s "variable_to_label": {"VAR00002": "labels0"},
1503s "notes": [],
1503s "original_variable_types": {"VAR00002": "F8.0"},
1503s "readstat_variable_types": {"VAR00002": "double"},
1503s "table_name": None,
1503s "missing_ranges": {},
1503s "missing_user_values": {},
1503s "variable_storage_width": {"VAR00002": 8},
1503s "variable_display_width": {"VAR00002": 8},
1503s "variable_alignment": {"VAR00002": "unknown"},
1503s "variable_measure": {"VAR00002": "unknown"},
1503s "file_label": None,
1503s "file_format": "sav/zsav",
1503s }
1503s if Version(pyreadstat.__version__) >= Version("1.2.4"):
1503s metadata.update(
1503s {
1503s "creation_time": datetime.datetime(2015, 2, 6, 14, 33, 36),
1503s "modification_time": datetime.datetime(2015, 2, 6, 14, 33, 36),
1503s }
1503s )
1503s > assert df.attrs == metadata
1503s E AssertionError: assert {'column_labe... 33, 36), ...} == {'column_labe... 33, 36), ...}
1503s E
1503s E Omitting 23 identical items, use -vv to show
1503s E Left contains 1 more item:
1503s E {'mr_sets': {}}
1503s E Use -v to get more diff
1503s
1503s /usr/lib/python3/dist-packages/pandas/tests/io/test_spss.py:165: AssertionError
1503s =============================== warnings summary ===============================
1503s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475
1503s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-5lcyq0rg'
1503s config.cache.set("cache/nodeids", sorted(self.cached_nodeids))
1503s
1503s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429
1503s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/lastfailed: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-dz901qxz'
1503s config.cache.set("cache/lastfailed", self.lastfailed)
1503s
1503s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51
1503s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-sewye84d'
1503s session.config.cache.set(STEPWISE_CACHE_DIR, [])
1503s
1503s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
1503s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml ---
1503s ============================= slowest 30 durations =============================
1503s 1.01s call tests/io/pytables/test_store.py::test_no_track_times
1503s 0.82s call tests/io/test_html.py::TestReadHtml::test_banklist_url[bs4]
1503s 0.82s call tests/io/test_pickle.py::test_pickle_big_dataframe_compression[gzip-4]
1503s 0.81s call tests/io/test_pickle.py::test_pickle_big_dataframe_compression[gzip-5]
1503s 0.57s call tests/io/test_html.py::TestReadHtml::test_wikipedia_states_table[bs4]
1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options2-gz_csv_responder-read_csv]
1503s 0.50s teardown tests/io/excel/test_readers.py::TestReaders::test_read_from_http_url[(None, '.xls')]
1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-gz_csv_responder-read_csv]
1503s 0.50s teardown tests/io/excel/test_readers.py::TestReaders::test_read_from_http_url[('xlrd', '.xls')]
1503s 0.50s teardown tests/io/parser/common/test_file_buffer_url.py::test_url[c_high]
1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-csv_responder-read_csv]
1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options2-stata_responder-read_stata]
1503s 0.50s teardown tests/io/json/test_pandas.py::TestPandasContainer::test_url[closed_at-datetime64[ns]]
1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[xz-c-explicit]
1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[bz2-python-explicit]
1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-gz_json_responder-read_json]
1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options1-stata_responder-read_stata]
1503s 0.50s teardown
tests/io/parser/common/test_file_buffer_url.py::test_url[python] 1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[zip-python-explicit] 1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options2-json_responder-read_json] 1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options1-gz_csv_responder-read_csv] 1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[bz2-python-infer] 1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[None-json_responder-read_json] 1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[bz2-c-infer] 1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[xz-python-infer] 1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options1-gz_json_responder-read_json] 1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[gzip-c-infer] 1503s 0.50s teardown tests/io/parser/test_network.py::test_compressed_urls[gzip-python-explicit] 1503s 0.50s teardown tests/io/test_http_headers.py::test_request_headers[storage_options2-csv_responder-read_csv] 1503s 0.50s teardown tests/io/parser/test_network.py::test_url_encoding_csv 1503s =========================== short test summary info ============================ 1503s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_spss.py::test_spss_metadata 1503s = 1 failed, 12141 passed, 3015 skipped, 201 deselected, 155 xfailed, 3 warnings in 118.50s (0:01:58) = 1505s + test 1 == 5 1505s + TEST_SUCCESS=false 1505s + echo 'rdjoqkol test state = false' 1505s + for TEST_SUBSET in $modpath/tests/* 1505s + echo /usr/lib/python3/dist-packages/pandas/tests/libs 1505s + grep -q -e __pycache__ 1505s rdjoqkol test state = false 1505s + PANDAS_CI=1 1505s + LC_ALL=C.UTF-8 1505s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/libs 1506s ============================= test session starts ============================== 1506s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1506s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1506s rootdir: /usr/lib/python3/dist-packages/pandas 1506s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1506s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1506s asyncio: mode=Mode.STRICT 1506s collected 2279 items 1506s 1508s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_hashtable.py 
..............s.....................................................................s.............s.......................................................s.............s.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s................................................................................................................................................................................................. 1508s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_join.py ................. 1508s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_lib.py .................................................................................. 1508s ../../../usr/lib/python3/dist-packages/pandas/tests/libs/test_libalgos.py ........ 
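Note on the single failure reported above (test_spss_metadata): pd.read_spss() returns a df.attrs dict with one extra key, 'mr_sets', that the test's hard-coded expectation does not contain; the diff shows the other 23 items are identical, so this looks like metadata newly exposed by the pyreadstat from proposed rather than a pandas regression. Below is a minimal, hedged sketch of how the expectation could be version-gated in the same style as the test's existing Version("1.2.4") gate. The "1.2.8" threshold, the fixture-path construction, and the empty-dict value are assumptions drawn from this log, not the upstream fix.

```python
# Hedged sketch, not the upstream patch: tolerate the 'mr_sets' entry that a
# newer pyreadstat adds to df.attrs, gated on the pyreadstat version.
import os

import pandas as pd
import pyreadstat
from packaging.version import Version

# Assumption: the SPSS fixture ships with the installed pandas test data.
fname = os.path.join(
    os.path.dirname(pd.__file__), "tests", "io", "data", "spss", "labelled-num.sav"
)
df = pd.read_spss(fname)

extra = {}
if Version(pyreadstat.__version__) >= Version("1.2.8"):  # assumed threshold
    extra["mr_sets"] = {}  # value observed in the assertion diff above

# test_spss_metadata builds a full `metadata` dict and asserts df.attrs == metadata;
# a `metadata.update(extra)` before that assert would keep it passing on both
# old and new pyreadstat releases.
assert set(extra).issubset(df.attrs)
```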
1508s 1508s =============================== warnings summary =============================== 1508s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1508s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-i1svkgt3' 1508s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1508s 1508s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1508s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-u_0xy_ct' 1508s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1508s 1508s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1508s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1508s ============================= slowest 30 durations ============================= 1508s 0.03s setup tests/libs/test_hashtable.py::TestHashTable::test_no_reallocation[33-UInt32HashTable-uint32] 1508s 0.01s teardown tests/libs/test_libalgos.py::TestInfinity::test_infinity_against_nan 1508s 0.01s call tests/libs/test_libalgos.py::test_groupsort_indexer 1508s 1508s (27 durations < 0.005s hidden. Use -vv to show these durations.) 1508s ================= 2273 passed, 6 skipped, 2 warnings in 2.37s ================== 1508s + echo 'rdjoqkol test state = false' 1508s + for TEST_SUBSET in $modpath/tests/* 1508s + echo /usr/lib/python3/dist-packages/pandas/tests/plotting 1508s + grep -q -e __pycache__ 1508s rdjoqkol test state = false 1508s + PANDAS_CI=1 1508s + LC_ALL=C.UTF-8 1508s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/plotting 1510s ============================= test session starts ============================== 1510s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1510s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1510s rootdir: /usr/lib/python3/dist-packages/pandas 1510s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1510s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1510s asyncio: mode=Mode.STRICT 1510s collected 1423 items / 212 deselected / 1211 selected 1510s 1524s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame.py ..................................................XX.........................s.s.s.s....................................................................................................x...................................................................................... 1528s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_color.py ......................................................................................... 1529s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_groupby.py ...... 
1530s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_legend.py x..................... 1535s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_frame_subplots.py .........x....................XX................................................................... 1536s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/frame/test_hist_box_by.py ............................. 1536s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_backend.py .....s. 1538s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_boxplot_method.py .................................................... 1538s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_common.py ... 1539s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_converter.py ............................................ 1551s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_datetimelike.py ...............................................................................................................x..........................................x.......................x...............x..... 1551s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_groupby.py ................. 1555s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_hist_method.py ...........................x..x...................................................... 1561s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_misc.py s....................sss...................sss...................sss.................................. 1564s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_series.py ...............................XXXX.............................x........................................................x......................... 1564s ../../../usr/lib/python3/dist-packages/pandas/tests/plotting/test_style.py ...................................... 
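Each of these per-directory pytest sessions is launched by the same shell pattern traced in the log: loop over $modpath/tests/*, skip __pycache__, export PANDAS_CI=1 and LC_ALL=C.UTF-8, and run pytest under xvfb-run with -m 'not slow'; an exit status of 5 ("no tests collected") is tolerated, while any other non-zero status flips TEST_SUCCESS to false, which is why every run after the io failure keeps echoing "test state = false". A rough Python rendering of that loop follows; the Debian-specific --deb-data-root-dir option is left out, and the function and variable names are illustrative, not the actual packaging test script.

```python
# Illustrative re-rendering of the shell loop traced in this log, not the real
# packaging test script.  Paths are copied from the trace above.
import os
import subprocess
import sys

MODPATH = "/usr/lib/python3/dist-packages/pandas"
PYPROJECT = "/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml"

def run_subset(subset: str) -> bool:
    env = dict(os.environ, PANDAS_CI="1", LC_ALL="C.UTF-8")
    cmd = [
        "xvfb-run", "--auto-servernum", "--server-args=-screen 0 1024x768x24",
        sys.executable, "-m", "pytest", "--tb=long", "-s", "-m", "not slow",
        "-c", PYPROJECT, f"--rootdir={MODPATH}", subset,
    ]
    rc = subprocess.run(cmd, env=env).returncode
    # the trace accepts exit status 5 ("no tests were collected") as a pass
    return rc in (0, 5)

test_success = True
for name in sorted(os.listdir(os.path.join(MODPATH, "tests"))):
    if "__pycache__" in name:
        continue
    if not run_subset(os.path.join(MODPATH, "tests", name)):
        test_success = False
    print(f"test state = {str(test_success).lower()}")
```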
1564s 1564s =============================== warnings summary =============================== 1564s tests/plotting/frame/test_frame.py: 11 warnings 1564s /usr/lib/python3/dist-packages/matplotlib/transforms.py:2652: RuntimeWarning: divide by zero encountered in scalar divide 1564s x_scale = 1.0 / inw 1564s 1564s tests/plotting/frame/test_frame.py: 11 warnings 1564s /usr/lib/python3/dist-packages/matplotlib/transforms.py:2654: RuntimeWarning: invalid value encountered in scalar multiply 1564s self._mtx = np.array([[x_scale, 0.0 , (-inl*x_scale)], 1564s 1564s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1564s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-5svio1q5' 1564s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1564s 1564s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1564s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-whtpplfv' 1564s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1564s 1564s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1564s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1564s ============================= slowest 30 durations ============================= 1564s 0.47s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis[True] 1564s 0.46s call tests/plotting/test_converter.py::test_registry_mpl_resets 1564s 0.46s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_errorbar_timeseries[barh] 1564s 0.41s call tests/plotting/test_misc.py::TestSeriesPlots::test_bootstrap_plot 1564s 0.39s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis_smaller[False] 1564s 0.39s call tests/plotting/test_converter.py::TestRegistration::test_dont_register_by_default 1564s 0.38s call tests/plotting/test_datetimelike.py::TestTSPlot::test_finder_daily 1564s 0.36s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_kde_df 1564s 0.36s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_errorbar_timeseries[line] 1564s 0.36s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_errorbar_timeseries[bar] 1564s 0.36s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis_smaller[True] 1564s 0.35s call tests/plotting/test_boxplot_method.py::TestDataFrameGroupByPlots::test_boxplot_legacy3[True-UserWarning-3-layout0] 1564s 0.33s call tests/plotting/frame/test_frame_subplots.py::TestDataFramePlotsSubplots::test_subplots_ts_share_axes 1564s 0.31s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_sharex_and_ax 1564s 0.29s call tests/plotting/test_misc.py::TestDataFramePlots::test_scatter_matrix_axis[False] 1564s 0.26s call tests/plotting/test_misc.py::TestDataFramePlots::test_externally_shared_axes 1564s 0.26s call tests/plotting/frame/test_frame_subplots.py::TestDataFramePlotsSubplots::test_subplots_dup_columns_secondary_y 1564s 0.25s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_sharey_and_ax 1564s 0.24s call 
tests/plotting/frame/test_frame.py::TestDataFramePlots::test_df_gridspec_patterns_vert_horiz 1564s 0.24s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_hist_df 1564s 0.23s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_df_grid_settings 1564s 0.22s call tests/plotting/frame/test_frame_subplots.py::TestDataFramePlotsSubplots::test_subplots_constrained_layout 1564s 0.21s call tests/plotting/test_hist_method.py::TestDataFrameGroupByPlots::test_grouped_hist_legacy_grouped_hist 1564s 0.21s call tests/plotting/test_boxplot_method.py::TestDataFramePlots::test_stacked_boxplot_set_axis 1564s 0.20s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_hist_df_coord[data1] 1564s 0.20s call tests/plotting/test_series.py::TestSeriesPlots::test_line_area_nan_series[index1] 1564s 0.20s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_hist_df_coord[data0] 1564s 0.20s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_invalid_logscale[loglog] 1564s 0.19s call tests/plotting/test_datetimelike.py::TestTSPlot::test_line_plot_inferred_freq[D] 1564s 0.19s call tests/plotting/frame/test_frame.py::TestDataFramePlots::test_table 1564s = 1177 passed, 15 skipped, 212 deselected, 11 xfailed, 8 xpassed, 24 warnings in 55.76s = 1565s rdjoqkol test state = false 1565s + echo 'rdjoqkol test state = false' 1565s + for TEST_SUBSET in $modpath/tests/* 1565s + echo /usr/lib/python3/dist-packages/pandas/tests/reductions 1565s + grep -q -e __pycache__ 1565s + PANDAS_CI=1 1565s + LC_ALL=C.UTF-8 1565s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/reductions 1566s ============================= test session starts ============================== 1566s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1566s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1566s rootdir: /usr/lib/python3/dist-packages/pandas 1566s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1566s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1566s asyncio: mode=Mode.STRICT 1566s collected 542 items 1566s 1566s ../../../usr/lib/python3/dist-packages/pandas/tests/reductions/test_reductions.py .....................................................................................................................................................................................................................................................................................................................................s............................................................................................................................... 1567s ../../../usr/lib/python3/dist-packages/pandas/tests/reductions/test_stat_reductions.py ......................................................................................... 
1567s 1567s =============================== warnings summary =============================== 1567s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1567s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-lfdokocm' 1567s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1567s 1567s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1567s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-gy8ydo1n' 1567s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1567s 1567s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1567s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1567s ============================= slowest 30 durations ============================= 1567s 0.28s call tests/reductions/test_stat_reductions.py::TestSeriesStatReductions::test_skew 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float64-True] 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float64-False] 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int64-True] 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int64-False] 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int32-True] 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_int[int32-False] 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float32-True] 1567s 0.01s call tests/reductions/test_reductions.py::TestSeriesReductions::test_sum_overflow_float[float32-False] 1567s 1567s (21 durations < 0.005s hidden. Use -vv to show these durations.) 
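The pair of PytestCacheWarning messages repeated in every session is just pytest's cache plugin failing to write .pytest_cache under the read-only /usr/lib/python3/dist-packages/pandas tree; it has no effect on the results. If the noise were unwanted, either of the standard pytest options below would avoid it (shown via pytest.main() for brevity; the same flags can be appended to the traced command line, and the cache path is illustrative).

```python
# Two standard ways to avoid the PytestCacheWarning noise seen in each session.
import pytest

tests = "/usr/lib/python3/dist-packages/pandas/tests/reductions"

# 1) disable the cache plugin entirely
pytest.main(["-p", "no:cacheprovider", "-m", "not slow", tests])

# 2) or keep the cache but point it at a writable directory (illustrative path)
pytest.main(["-o", "cache_dir=/tmp/pandas-pytest-cache", "-m", "not slow", tests])
```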
1567s ================== 541 passed, 1 skipped, 2 warnings in 1.43s ================== 1567s + echo 'rdjoqkol test state = false' 1567s rdjoqkol test state = false 1567s + for TEST_SUBSET in $modpath/tests/* 1567s + echo /usr/lib/python3/dist-packages/pandas/tests/resample 1567s + grep -q -e __pycache__ 1567s + PANDAS_CI=1 1567s + LC_ALL=C.UTF-8 1567s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/resample 1568s ============================= test session starts ============================== 1568s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1568s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1568s rootdir: /usr/lib/python3/dist-packages/pandas 1568s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1568s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1568s asyncio: mode=Mode.STRICT 1568s collected 4179 items 1568s 1571s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_base.py ..............................................................................................................................................................xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx......xxx............................................................................................................................................................................................................................................................................................................................................................................................................................................................. 
1583s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_datetime_index.py ................................................................................ssss............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x...x...x...x..........................................................................................................................................................................................................................................................................................................................................................................ss 1586s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_period_index.py .................................................................................................................................................................................................................................................................................................................................................................................................................................x.................................................................................. 
1587s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_resample_api.py ..................................................................................................................................................................................................... 1588s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_resampler_grouper.py s................................................... 1588s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_time_grouper.py .........................x...................... 1588s ../../../usr/lib/python3/dist-packages/pandas/tests/resample/test_timedelta.py .......................s 1588s 1588s =============================== warnings summary =============================== 1588s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1588s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-5r7r0zzl' 1588s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1588s 1588s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1588s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-9pv3skqx' 1588s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1588s 1588s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1588s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1588s ============================= slowest 30 durations ============================= 1588s 0.18s call tests/resample/test_datetime_index.py::test_resample_dtype_coercion[s] 1588s 0.11s call tests/resample/test_resample_api.py::test_apply_without_aggregation2 1588s 0.10s call tests/resample/test_base.py::test_resample_interpolate[period_range-pi-_index_start1-_index_end1] 1588s 0.09s call tests/resample/test_period_index.py::TestPeriodIndex::test_quarterly_upsample[D-D-start-NOV] 1588s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-30-s-0.5-Min-1] 1588s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-60-s-1-Min-2] 1588s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-30-s-0.5-Min-3] 1588s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-60-s-1-Min-3] 1588s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-30-s-0.5-Min-2] 1588s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-60-s-1-Min-1] 1588s 0.07s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-3600-s-1-h-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-3600-s-1-h-2] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-30-s-0.5-Min-3] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-30-s-0.5-Min-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-3600-s-1-h-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-21600-s-0.25-D-2] 1588s 0.06s call 
tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-86400-s-1-D-2] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-86400-s-1-D-2] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-21600-s-0.25-D-3] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-60-s-1-Min-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-30-s-0.5-Min-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-21600-s-0.25-D-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[s-21600-s-0.25-D-2] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-86400-s-1-D-3] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-21600-s-0.25-D-2] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-43200-s-0.5-D-3] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-43200-s-0.5-D-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-60-s-1-Min-2] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[us-21600-s-0.25-D-1] 1588s 0.06s call tests/resample/test_datetime_index.py::test_resample_equivalent_offsets[ms-60-s-1-Min-3] 1588s =========== 4117 passed, 8 skipped, 54 xfailed, 2 warnings in 20.35s =========== 1588s rdjoqkol test state = false 1588s + echo 'rdjoqkol test state = false' 1588s + for TEST_SUBSET in $modpath/tests/* 1588s + echo /usr/lib/python3/dist-packages/pandas/tests/reshape 1588s + grep -q -e __pycache__ 1588s + PANDAS_CI=1 1588s + LC_ALL=C.UTF-8 1588s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/reshape 1590s ============================= test session starts ============================== 1590s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1590s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1590s rootdir: /usr/lib/python3/dist-packages/pandas 1590s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1590s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1590s asyncio: mode=Mode.STRICT 1590s collected 2610 items / 1 deselected / 2609 selected 1590s 1590s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_append.py .................................................................................. 1590s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_append_common.py ...........................sssssssss....................................................... 1590s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_categorical.py ............. 1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_concat.py .............................................................................................. 1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_dataframe.py ..................... 
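Aside on the resample run's slowest-durations table above: the test_resample_equivalent_offsets cases that dominate it appear, from their parameter ids, to compare resampling by an explicit number of ticks against a fractional alias of a larger unit (for example "30s" versus "0.5min"). A small hedged illustration of that equivalence, assuming the ids mean what they suggest:

```python
# Hedged illustration of what the test_resample_equivalent_offsets ids above
# appear to exercise: "0.5min" normalizes to the same 30-second tick as "30s".
import numpy as np
import pandas as pd

idx = pd.date_range("2024-01-01", periods=600, freq="s")
ser = pd.Series(np.arange(600.0), index=idx)

by_seconds = ser.resample("30s").mean()
by_fraction = ser.resample("0.5min").mean()

assert by_seconds.equals(by_fraction)
```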
1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_datetimes.py ..................................................................................................x......... 1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_empty.py .....................s.....s.....s.....s.....s.......... 1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_index.py ............................................................. 1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_invalid.py ....... 1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_series.py ............. 1591s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/concat/test_sort.py .......... 1592s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_join.py .......s..........................s....................................... 1594s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge.py ..............................................................................................................................................................................................................................................................................................................................................................................................................................ssssssss..............................................................................................................................................................................................................................ss..........................................................................................................................................................................................................................................................................ssss........ 1595s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_asof.py .s...................................................................................................s.....................s....s.s.s.s.............sss.. 1595s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_cross.py ................. 1596s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_index_as_string.py ................................................................................ 1596s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_merge_ordered.py ..................... 1596s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/merge/test_multi.py .....s.s.................................. 1597s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_crosstab.py ..................................... 1597s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_cut.py ..................................................................................................................................... 1597s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_from_dummies.py ......................................... 
1598s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_get_dummies.py ...................................................................................................................................................ss 1598s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_melt.py ..........................................................s 1600s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_pivot.py ...........................................................................xx................................................................................................. 1600s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_pivot_multilevel.py .......... 1600s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_qcut.py ................................................................................ 1600s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_union_categoricals.py .......................................... 1600s ../../../usr/lib/python3/dist-packages/pandas/tests/reshape/test_util.py ................. 1600s 1600s =============================== warnings summary =============================== 1600s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1600s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-bfqghybr' 1600s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1600s 1600s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1600s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-jkzo503r' 1600s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1600s 1600s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1600s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1600s ============================= slowest 30 durations ============================= 1600s 0.07s call tests/reshape/test_qcut.py::test_single_quantile[False-1-0.0--0.001-0.0] 1600s 0.07s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_multi_functions 1600s 0.06s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_normalize 1600s 0.06s call tests/reshape/merge/test_join.py::TestJoin::test_full_outer_join 1600s 0.05s call tests/reshape/merge/test_merge.py::test_merge_combinations[True-False-True-False-True-True-outer] 1600s 0.05s call tests/reshape/merge/test_merge.py::TestMerge::test_merge_non_unique_indexes 1600s 0.05s call tests/reshape/test_pivot.py::TestPivotTable::test_margins 1600s 0.04s call tests/reshape/concat/test_concat.py::TestConcatenate::test_concat_order 1600s 0.04s call tests/reshape/test_crosstab.py::TestCrosstab::test_margin_dropna6 1600s 0.04s call tests/reshape/test_crosstab.py::TestCrosstab::test_margin_normalize 1600s 0.04s call tests/reshape/merge/test_join.py::TestJoin::test_right_outer_join 1600s 0.03s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_timegrouper 1600s 0.03s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_normalize_arrays 1600s 0.03s call tests/reshape/test_crosstab.py::test_categoricals[category-category] 1600s 0.02s 
call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_margins_set_margin_name 1600s 0.02s call tests/reshape/merge/test_multi.py::TestMergeMulti::test_compress_group_combinations 1600s 0.02s call tests/reshape/test_pivot.py::TestPivotTable::test_daily 1600s 0.02s call tests/reshape/test_crosstab.py::test_categoricals[int64-category] 1600s 0.02s call tests/reshape/test_crosstab.py::test_categoricals[category-int64] 1600s 0.02s call tests/reshape/test_qcut.py::test_qcut_binning_issues 1600s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_ndarray[tuple] 1600s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_ndarray[list] 1600s 0.02s call tests/reshape/test_crosstab.py::test_categoricals[int64-int64] 1600s 0.02s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_timegrouper_double 1600s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_ndarray[array] 1600s 0.02s call tests/reshape/merge/test_join.py::TestJoin::test_join_many_non_unique_index 1600s 0.02s call tests/reshape/test_pivot.py::TestPivotTable::test_pivot_table_margins_name_with_aggfunc_list 1600s 0.02s call tests/reshape/merge/test_merge.py::TestMerge::test_validation 1600s 0.02s call tests/reshape/test_crosstab.py::TestCrosstab::test_crosstab_duplicate_names 1600s 0.02s call tests/reshape/merge/test_join.py::TestJoin::test_left_outer_join 1600s ==== 2561 passed, 45 skipped, 1 deselected, 3 xfailed, 2 warnings in 11.53s ==== 1601s + echo 'rdjoqkol test state = false' 1601s rdjoqkol test state = false 1601s + for TEST_SUBSET in $modpath/tests/* 1601s + echo /usr/lib/python3/dist-packages/pandas/tests/scalar 1601s + grep -q -e __pycache__ 1601s + PANDAS_CI=1 1601s + LC_ALL=C.UTF-8 1601s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/scalar 1603s ============================= test session starts ============================== 1603s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1603s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1603s rootdir: /usr/lib/python3/dist-packages/pandas 1603s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1603s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1603s asyncio: mode=Mode.STRICT 1603s collected 4329 items 1603s 1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_arithmetic.py ............................................ 1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_constructors.py ......... 1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_contains.py ................ 1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_formats.py . 1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_interval.py ............................................ 1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/interval/test_overlaps.py ................................................................................................................................................................. 
1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/period/test_arithmetic.py .................................................................................... 1603s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/period/test_asfreq.py ....................... 1604s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/period/test_period.py ............................................................................................................................................................................................................................................................................................................... 1604s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/test_na_scalar.py .....................................................................................ss.....ss.....ss................................................................................................................................................................................ 1604s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/test_nat.py ........................................................................................................................s............s............................................................................................................................................................................................................................ 1604s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/methods/test_as_unit.py .... 1605s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/methods/test_round.py ................... 1605s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_arithmetic.py ................................................................................................................................ 1605s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_constructors.py ................................................................................................................................................................................................................................................................................................................... 1605s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_formats.py .............. 1605s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timedelta/test_timedelta.py .................................................................x............ 1605s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_as_unit.py .... 1606s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_normalize.py ................................................................................................................................................................. 1606s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_replace.py ............................................................................................................................ 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_round.py ....................................................................................................................................................................................... 
1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_timestamp_method.py . 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_to_julian_date.py ..... 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_to_pydatetime.py ....... 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_tz_convert.py ............................................................................... 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/methods/test_tz_localize.py ................................................................................................................................................................................................. 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_arithmetic.py ............................................................................................................. 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_comparisons.py .............................. 1607s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_constructors.py ..................................................................xxx.............................................................. 1608s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_formats.py ........................................................................... 1610s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_timestamp.py .....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x....................................................................................................................................................................................................................................................................................................................................... 1610s ../../../usr/lib/python3/dist-packages/pandas/tests/scalar/timestamp/test_timezones.py .................... 
1610s 1610s =============================== warnings summary =============================== 1610s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1610s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-d3dib2n1' 1610s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1610s 1610s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1610s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-4orjjjiw' 1610s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1610s 1610s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1610s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1610s ============================= slowest 30 durations ============================= 1610s 0.21s call tests/scalar/timestamp/methods/test_round.py::TestTimestampRound::test_round_sanity[floor] 1610s 0.18s call tests/scalar/timestamp/test_timestamp.py::TestTimestampProperties::test_dow_parametric 1610s 0.14s call tests/scalar/timestamp/methods/test_round.py::TestTimestampRound::test_round_sanity[round] 1610s 0.14s call tests/scalar/timestamp/methods/test_round.py::TestTimestampRound::test_round_sanity[ceil] 1610s 0.13s call tests/scalar/timedelta/methods/test_round.py::TestTimedeltaRound::test_round_sanity[round] 1610s 0.13s call tests/scalar/timedelta/methods/test_round.py::TestTimedeltaRound::test_round_sanity[ceil] 1610s 0.12s call tests/scalar/timedelta/methods/test_round.py::TestTimedeltaRound::test_round_sanity[floor] 1610s 0.10s call tests/scalar/timedelta/test_timedelta.py::TestTimedeltas::test_hash_equality_invariance 1610s 0.04s setup tests/scalar/test_na_scalar.py::test_arithmetic_ops[mod-b'a'] 1610s 0.03s teardown tests/scalar/timestamp/test_timezones.py::TestTimestampTZOperations::test_timestamp_timetz_equivalent_with_datetime_tz[zoneinfo.ZoneInfo(key='UTC')] 1610s 0.02s call tests/scalar/timestamp/test_timestamp.py::test_negative_dates 1610s 0.01s call tests/scalar/timestamp/methods/test_tz_localize.py::TestTimestampTZLocalize::test_tz_localize_ambiguous 1610s 0.01s call tests/scalar/test_nat.py::test_nat_vector_field_access 1610s 0.01s call tests/scalar/timestamp/methods/test_tz_localize.py::TestTimestampTZLocalize::test_tz_localize_ambiguous_bool[ns] 1610s 0.01s call tests/scalar/period/test_period.py::TestPeriodMethods::test_to_timestamp 1610s 0.01s call tests/scalar/timestamp/test_constructors.py::TestTimestampConstructorFoldKeyword::test_timestamp_constructor_fold_conflict[1572136200000000000-0] 1610s 0.01s call tests/scalar/timedelta/test_timedelta.py::TestTimedeltas::test_implementation_limits 1610s 0.01s call tests/scalar/period/test_arithmetic.py::TestPeriodArithmetic::test_period_add_offset 1610s 0.01s call tests/scalar/timestamp/methods/test_tz_localize.py::TestTimestampTZLocalize::test_tz_localize_pushes_out_of_bounds 1610s 0.01s call tests/scalar/timestamp/test_constructors.py::TestTimestampConstructors::test_constructor_invalid_tz 1610s 0.01s call tests/scalar/timestamp/test_arithmetic.py::TestTimestampArithmetic::test_overflow_offset_raises 1610s 0.01s call 
tests/scalar/timestamp/methods/test_round.py::TestTimestampRound::test_round_implementation_bounds 1610s 1610s (8 durations < 0.005s hidden. Use -vv to show these durations.) 1610s ============ 4316 passed, 8 skipped, 5 xfailed, 2 warnings in 8.88s ============ 1610s rdjoqkol test state = false 1610s + echo 'rdjoqkol test state = false' 1610s + for TEST_SUBSET in $modpath/tests/* 1610s + echo /usr/lib/python3/dist-packages/pandas/tests/series 1610s + grep -q -e __pycache__ 1610s + PANDAS_CI=1 1610s + LC_ALL=C.UTF-8 1610s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/series 1614s ============================= test session starts ============================== 1614s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1614s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1614s rootdir: /usr/lib/python3/dist-packages/pandas 1614s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1614s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1614s asyncio: mode=Mode.STRICT 1614s collected 13018 items / 2 skipped 1614s 1614s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_cat_accessor.py ................... 1619s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_dt_accessor.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1619s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_sparse_accessor.py . 1619s ../../../usr/lib/python3/dist-packages/pandas/tests/series/accessors/test_str_accessor.py .. 1619s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_datetime.py ................. 1619s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_delitem.py .... 1619s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_get.py ............ 1619s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_getitem.py .............................................................................................. 
1620s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_indexing.py .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1620s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_mask.py .... 1620s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_set_value.py ... 1624s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_setitem.py .......................................................................................................................sss....sss....sss....sss....sss....sss....sss....sss....sss....sss....sss....sss...s...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...ssss...sss.................................................................................................................................................................................sssssssss..................ssssssssssssssssss............................................................ssssssssssss..........................................................................................ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss...........................................................................................................................sss..................ssssssssssss...........................sss.......sssssssssssssssssssssssssss.................................................................................sssssssss.................................ssssss.....................sssssssss........................ssssss........................ssssssssssss....................................ssssssssssss..........................................ssssssssssssssssss................................................ssssssssssss....................................ssssssssssss.........................x........x........x........x........x........x........x........x........x.......sssssssssssssssssssssssssss.........sssssssss..............................ssssssssssss.................................sssssssss....................................ssssssssssssssssss................................................................................................................................................................................................... 1624s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_take.py .... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_where.py ......................................................................................................................................................................................................... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/indexing/test_xs.py ...... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_add_prefix_suffix.py ... 
1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_align.py ............................................................................................................... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_argsort.py ......... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_asof.py ....... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_astype.py ......................................................s....s..........................................................................x........sssssssss.................s 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_autocorr.py . 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_between.py ....... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_case_when.py ........... 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_clip.py ....s.....s.....s.....s.....s.....s.....s.....s.....s.....s............ 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_combine.py . 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_combine_first.py .............................. 1625s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_compare.py ............ 1627s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_convert_dtypes.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................s......ss 1627s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_copy.py .......... 1627s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_count.py ... 1627s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_cov_corr.py ................ 1627s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_describe.py ...................................................... 1627s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_diff.py ....... 1627s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_drop.py ............................. 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_drop_duplicates.py ..................................................................ssssss.............................................................................................................................................................................................................................................................................................................ss 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_dropna.py ........... 
1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_dtypes.py . 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_duplicated.py .................. 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_equals.py ..................................................... 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_explode.py ...............ssss 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_fillna.py ....................x.x.x................................................................................................................................. 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_get_numeric_data.py . 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_head_tail.py . 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_infer_objects.py ....... 1628s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_info.py ........x..... 1629s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_interpolate.py x.........................................................................................................................................................................................................x.x............. 1629s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_is_monotonic.py .. 1629s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_is_unique.py ........ 1629s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_isin.py ......................................... 1629s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_isna.py .. 1629s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_item.py . 1630s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_map.py ......ss...................................xxx....................................................... 1630s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_matmul.py . 1630s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_nlargest.py ................................................................. 1630s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_nunique.py .. 1630s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_pct_change.py .............. 1630s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_pop.py . 1630s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_quantile.py ........................................ 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_rank.py .......................................................................................................ssssssssssssssssssssssssssssss........................................................ssssssssssssssss.....ss.....ss.....ss.....ss.....ss.....................................................ssssssssssssssssss.............................................ssssssssssssssssss.............................................ssssssssssssssssss.............................................ssssssssssssssssss.............................................ssssssssssssssssss. 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_reindex.py ................................... 
1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_reindex_like.py .. 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_rename.py ................ 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_rename_axis.py ..... 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_repeat.py ... 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_replace.py ...........................s.......................................................................... 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_reset_index.py ........s...... 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_round.py ......................................................................................... 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_searchsorted.py ........ 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_set_name.py .. 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_size.py ....... 1631s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_sort_index.py .............................................. 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_sort_values.py .............. 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_csv.py ................................... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_dict.py ...... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_frame.py ... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_to_numpy.py ...s. 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_tolist.py ..........sss 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_truncate.py .... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_tz_localize.py ................................................................ 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_unique.py ....... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_unstack.py ....... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_update.py ....................s..... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_value_counts.py ................... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_values.py ... 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/methods/test_view.py .................................................. 1632s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_api.py ................................s.......................................................................................................................................................... 
1638s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_arithmetic.py ...............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x...............................x............. 1639s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_constructors.py ......................................................................................................................................................................................................................................x.........x............................................................................................s.................................xx.................................................sssssss.s..........ss.......... 1639s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_cumulative.py ....................................... 1639s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_formats.py .................................................. 1640s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_iteration.py ....... 1640s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_logical_ops.py ..........................xs 1640s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_missing.py ...x.. 1640s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_npfuncs.py ....s 1640s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_reductions.py ..............s............... 1640s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_subclass.py ......... 1641s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_ufunc.py .....................................................................xxxx........................................................................................................................................ 1641s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_unary.py .......................... 1641s ../../../usr/lib/python3/dist-packages/pandas/tests/series/test_validate.py ............................ 
1641s 1641s =============================== warnings summary =============================== 1641s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1641s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-9m5wm2dv' 1641s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1641s 1641s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1641s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-2v89urgw' 1641s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1641s 1641s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1641s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1641s ============================= slowest 30 durations ============================= 1641s 0.37s call tests/series/methods/test_cov_corr.py::TestSeriesCorr::test_corr[float64] 1641s 0.28s call tests/series/test_formats.py::TestSeriesRepr::test_latex_repr 1641s 0.26s call tests/series/methods/test_rank.py::test_pct_max_many_rows 1641s 0.22s call tests/series/test_arithmetic.py::TestNamePreservation::test_series_ops_name_retention[python-names1-truediv-False-array1] 1641s 0.08s call tests/series/indexing/test_indexing.py::TestSetitemValidation::test_setitem_validation_scalar_int[uint8-indexer2-1.0] 1641s 0.06s call tests/series/methods/test_isin.py::TestSeriesIsIn::test_isin 1641s 0.05s teardown tests/series/test_validate.py::test_validate_bool_args[5.0-drop_duplicates] 1641s 0.02s call tests/series/test_api.py::TestSeriesMisc::test_inspect_getmembers 1641s 0.02s call tests/series/accessors/test_cat_accessor.py::TestCatAccessor::test_dt_accessor_api_for_categorical[idx1] 1641s 0.01s call tests/series/accessors/test_cat_accessor.py::TestCatAccessor::test_dt_accessor_api_for_categorical[idx0] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[shn_MM.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_ambiguous_freq_conversions 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[mk_MK.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[de_DE.UTF-8] 1641s 0.01s call tests/series/methods/test_reset_index.py::TestResetIndex::test_reset_index_level 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[am_ET.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[so_SO.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[hr_HR.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[byn_ER.UTF-8] 1641s 0.01s call 
tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[gez_ET.UTF-8@abegede] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[aa_ER.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[agr_PE.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[hu_HU.ISO8859-2] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[el_CY.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[de_IT.ISO8859-1] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[ar_SA.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[gez_ET.UTF-8] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[aa_DJ.ISO8859-1] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[hr_HR.ISO8859-2] 1641s 0.01s call tests/series/accessors/test_dt_accessor.py::TestSeriesDatetimeValues::test_dt_accessor_datetime_name_accessors[gez_ER.UTF-8@abegede] 1641s ========= 12380 passed, 608 skipped, 32 xfailed, 2 warnings in 30.09s ========== 1642s + echo 'rdjoqkol test state = false' 1642s + for TEST_SUBSET in $modpath/tests/* 1642s rdjoqkol test state = false 1642s + echo /usr/lib/python3/dist-packages/pandas/tests/strings 1642s + grep -q -e __pycache__ 1642s + PANDAS_CI=1 1642s + LC_ALL=C.UTF-8 1642s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/strings 1643s ============================= test session starts ============================== 1643s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1643s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1643s rootdir: /usr/lib/python3/dist-packages/pandas 1643s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1643s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1643s asyncio: mode=Mode.STRICT 1643s collected 3604 items 1643s 1647s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_api.py 
..ss.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x.........................................................................................................xx............xx......xx............................xx........................xxxx..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss 1647s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_case_justify.py ..ss...ss.......ssssss...ss....ss..ss...ss..ss..........ssssssssss..ss...ss..ss..ss..ss..ss 1647s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_cat.py .......s.s...s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s...................................... 1648s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_extract.py ..ss..ss....ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ssssss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss...ss..ss..ss..ss....ssss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ssssss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss..ss....ssss..ss..ss................ssssssssssssssss..ss..ss...ss..sss 1648s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_find_replace.py ..ss..............ssssssssssss..ss..ss...........................sss...........................sss..ss..ss...ss........................ssssssssssssssssssssssss..ss......ssssss..ss..ss...ss..ss..ss....ssss..ss..ss..ss..ss..ss....ssss..ss...ss..ss..ss..ss..ss..ss..ss...ss..ss..ss.s.sssss.s.sssss...ss 1648s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_get_dummies.py ..ss...ss. 
1648s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_split_partition.py ....ssss....ssss..ss..ss..ss............ssssssss..ss..ss..ss..ss....ssss............ssssssssssss....ssss..ss..ss..ss..ss.....ss..ss..ss.....ss..ss.....ssss....ssss....ssss....ssss....ssss......ssss....ssss..ss..ss......ssss...........ss 1648s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_string_array.py ...............................................................................sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss....ssss....ssss.s 1649s ../../../usr/lib/python3/dist-packages/pandas/tests/strings/test_strings.py ......ss...ss.....ssss..ss................ssssssssssssssss....ssss....ssss..ss...ss.............ssssssssssss............ssssssssssss..ss..ss....ssss....ssss....ssss..ss..........ssssssssss..................ssssssssssssssss......ssssss.........ssssss....ssss....ssss..ss...ss..ss..ss.....ssss..ss....................ss....... 1649s 1649s =============================== warnings summary =============================== 1649s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1649s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-9pphzm7a' 1649s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1649s 1649s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1649s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-rkt5alh2' 1649s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1649s 1649s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1649s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1649s ============================= slowest 30 durations ============================= 1649s 0.05s call tests/strings/test_api.py::test_api_for_categorical[startswith4-object] 1649s 0.04s call tests/strings/test_api.py::test_api_per_method[index-empty0-istitle-category] 1649s 0.01s call tests/strings/test_strings.py::test_empty_str_methods[string[python]] 1649s 0.01s teardown tests/strings/test_strings.py::test_series_str_decode 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data5-names5] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data6-names6] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data7-names7] 1649s 0.01s call tests/strings/test_strings.py::test_empty_str_methods[object] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data2-names2] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data3-names3] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data5-names5] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data6-names6] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data2-names2] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data7-names7] 1649s 0.01s call 
tests/strings/test_extract.py::test_extractall_no_matches[object-data3-names3] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data0-names0] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data1-names1] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[string[python]-data4-names4] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_same_as_extract_subject_index[string[python]] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data0-names0] 1649s 0.01s call tests/strings/test_extract.py::test_extract_expand_capture_groups[string[python]] 1649s 0.01s call tests/strings/test_extract.py::test_extractall[string[python]] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data1-names1] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_same_as_extract_subject_index[object] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_no_matches[object-data4-names4] 1649s 0.01s call tests/strings/test_extract.py::test_extract_expand_capture_groups[object] 1649s 0.01s call tests/strings/test_extract.py::test_extractall[object] 1649s 0.01s call tests/strings/test_extract.py::test_extractall_same_as_extract[string[python]] 1649s 1649s (2 durations < 0.005s hidden. Use -vv to show these durations.) 1649s ========== 2704 passed, 887 skipped, 13 xfailed, 2 warnings in 5.87s =========== 1649s + echo 'rdjoqkol test state = false' 1649s + for TEST_SUBSET in $modpath/tests/* 1649s rdjoqkol test state = false 1649s + echo /usr/lib/python3/dist-packages/pandas/tests/test_aggregation.py 1649s + grep -q -e __pycache__ 1649s + PANDAS_CI=1 1649s + LC_ALL=C.UTF-8 1649s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_aggregation.py 1650s ============================= test session starts ============================== 1650s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1650s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1650s rootdir: /usr/lib/python3/dist-packages/pandas 1650s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1650s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1650s asyncio: mode=Mode.STRICT 1650s collected 8 items 1650s 1650s ../../../usr/lib/python3/dist-packages/pandas/tests/test_aggregation.py ........ 
1650s 1650s =============================== warnings summary =============================== 1650s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1650s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xx82bxze' 1650s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1650s 1650s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1650s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-l3tvrs4d' 1650s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1650s 1650s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1650s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1650s ============================= slowest 30 durations ============================= 1650s 1650s (24 durations < 0.005s hidden. Use -vv to show these durations.) 1650s ======================== 8 passed, 2 warnings in 0.12s ========================= 1650s + echo 'rdjoqkol test state = false' 1650s + for TEST_SUBSET in $modpath/tests/* 1650s rdjoqkol test state = false 1650s + echo /usr/lib/python3/dist-packages/pandas/tests/test_algos.py 1650s + grep -q -e __pycache__ 1650s + PANDAS_CI=1 1650s + LC_ALL=C.UTF-8 1650s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_algos.py 1651s ============================= test session starts ============================== 1651s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1651s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1651s rootdir: /usr/lib/python3/dist-packages/pandas 1651s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1651s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1651s asyncio: mode=Mode.STRICT 1651s collected 463 items 1651s 1653s ../../../usr/lib/python3/dist-packages/pandas/tests/test_algos.py ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 
1653s 1653s =============================== warnings summary =============================== 1653s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1653s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xw5te8f0' 1653s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1653s 1653s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1653s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-n266j34s' 1653s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1653s 1653s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1653s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1653s ============================= slowest 30 durations ============================= 1653s 1.14s call tests/test_algos.py::TestRank::test_pct_max_many_rows 1653s 0.25s call tests/test_algos.py::TestRank::test_scipy_compat[arr0] 1653s 0.03s call tests/test_algos.py::TestFactorize::test_factorize[string-python-False] 1653s 0.02s call tests/test_algos.py::TestIsin::test_large 1653s 0.01s call tests/test_algos.py::TestDuplicated::test_datetime_likes 1653s 0.01s call tests/test_algos.py::TestUnique::test_object_refcount_bug 1653s 0.01s call tests/test_algos.py::TestIsin::test_same_nan_is_in_large_series 1653s 1653s (23 durations < 0.005s hidden. Use -vv to show these durations.) 1653s ======================= 463 passed, 2 warnings in 2.25s ======================== 1653s + echo 'rdjoqkol test state = false' 1653s + for TEST_SUBSET in $modpath/tests/* 1653s rdjoqkol test state = false 1653s + echo /usr/lib/python3/dist-packages/pandas/tests/test_common.py 1653s + grep -q -e __pycache__ 1653s + PANDAS_CI=1 1653s + LC_ALL=C.UTF-8 1653s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_common.py 1654s ============================= test session starts ============================== 1654s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1654s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1654s rootdir: /usr/lib/python3/dist-packages/pandas 1654s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1654s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1654s asyncio: mode=Mode.STRICT 1654s collected 128 items 1654s 1654s ../../../usr/lib/python3/dist-packages/pandas/tests/test_common.py ...............x.x.............................................................................................................. 
1654s 1654s =============================== warnings summary =============================== 1654s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1654s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-a197d_f8' 1654s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1654s 1654s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1654s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-61bqf6vy' 1654s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1654s 1654s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1654s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1654s ============================= slowest 30 durations ============================= 1654s 0.39s call tests/test_common.py::test_bz2_missing_import 1654s 0.31s call tests/test_common.py::test_str_size 1654s 1654s (28 durations < 0.005s hidden. Use -vv to show these durations.) 1654s ================== 126 passed, 2 xfailed, 2 warnings in 1.02s ================== 1655s + echo 'rdjoqkol test state = false' 1655s + for TEST_SUBSET in $modpath/tests/* 1655s + echo /usr/lib/python3/dist-packages/pandas/tests/test_downstream.py 1655s + grep -q -e __pycache__ 1655s rdjoqkol test state = false 1655s + PANDAS_CI=1 1655s + LC_ALL=C.UTF-8 1655s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_downstream.py 1656s ============================= test session starts ============================== 1656s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1656s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1656s rootdir: /usr/lib/python3/dist-packages/pandas 1656s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1656s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1656s asyncio: mode=Mode.STRICT 1656s collected 26 items 1656s 1659s ../../../usr/lib/python3/dist-packages/pandas/tests/test_downstream.py XXX.s..sssss.s..........s. 1659s 1659s =============================== warnings summary =============================== 1659s tests/test_downstream.py::test_dask 1659s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:31: FutureWarning: 1659s Dask dataframe query planning is disabled because dask-expr is not installed. 1659s 1659s You can install it with `pip install dask[dataframe]` or `conda install dask`. 1659s This will raise in a future version. 
1659s 1659s warnings.warn(msg, FutureWarning) 1659s 1659s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1659s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-48r0b3c5' 1659s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1659s 1659s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1659s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ms818fzq' 1659s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1659s 1659s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1659s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1659s ============================= slowest 30 durations ============================= 1659s 1.58s call tests/test_downstream.py::test_oo_optimizable 1659s 1.57s call tests/test_downstream.py::test_oo_optimized_datetime_index_unpickle 1659s 0.10s call tests/test_downstream.py::test_dask 1659s 0.01s call tests/test_downstream.py::test_construct_dask_float_array_int_dtype_match_ndarray 1659s 0.01s call tests/test_downstream.py::test_yaml_dump 1659s 1659s (25 durations < 0.005s hidden. Use -vv to show these durations.) 1659s ============= 15 passed, 8 skipped, 3 xpassed, 3 warnings in 3.72s ============= 1659s rdjoqkol test state = false 1659s + echo 'rdjoqkol test state = false' 1659s + for TEST_SUBSET in $modpath/tests/* 1659s + echo /usr/lib/python3/dist-packages/pandas/tests/test_errors.py 1659s + grep -q -e __pycache__ 1659s + PANDAS_CI=1 1659s + LC_ALL=C.UTF-8 1659s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_errors.py 1660s ============================= test session starts ============================== 1660s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1660s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1660s rootdir: /usr/lib/python3/dist-packages/pandas 1660s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1660s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1660s asyncio: mode=Mode.STRICT 1660s collected 36 items 1660s 1660s ../../../usr/lib/python3/dist-packages/pandas/tests/test_errors.py .................................... 
1660s 1660s =============================== warnings summary =============================== 1660s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1660s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-fg2gy7di' 1660s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1660s 1660s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1660s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-sno7h2mu' 1660s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1660s 1660s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1660s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1660s ============================= slowest 30 durations ============================= 1660s 1660s (30 durations < 0.005s hidden. Use -vv to show these durations.) 1660s ======================== 36 passed, 2 warnings in 0.14s ======================== 1660s rdjoqkol test state = false 1660s + echo 'rdjoqkol test state = false' 1660s + for TEST_SUBSET in $modpath/tests/* 1660s + echo /usr/lib/python3/dist-packages/pandas/tests/test_expressions.py 1660s + grep -q -e __pycache__ 1660s + PANDAS_CI=1 1660s + LC_ALL=C.UTF-8 1660s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_expressions.py 1661s ============================= test session starts ============================== 1661s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1661s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1661s rootdir: /usr/lib/python3/dist-packages/pandas 1661s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1661s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1661s asyncio: mode=Mode.STRICT 1661s collected 243 items 1661s 1662s ../../../usr/lib/python3/dist-packages/pandas/tests/test_expressions.py ................................................................................................................................................................................................................................................... 
1662s 1662s =============================== warnings summary =============================== 1662s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1662s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-7ebgncw_' 1662s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1662s 1662s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1662s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-1c4y_kfn' 1662s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1662s 1662s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1662s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1662s ============================= slowest 30 durations ============================= 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_invalid 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-False-_integer_integers] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mod-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[ne-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[le-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mod-False-_integer_integers] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-True-_integer_integers] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[gt-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[truediv-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[le-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[ge-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[lt-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mod-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[eq-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[eq-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[ge-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[truediv-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mod-True-_integer_integers] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-True-_frame] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[floordiv-False-_frame] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[add-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[gt-True-_mixed] 
1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[ne-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_binary[lt-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[sub-False-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[sub-True-_mixed] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mod-True-_frame] 1662s 0.01s call tests/test_expressions.py::TestExpressions::test_run_arithmetic[mul-False-_mixed] 1662s ======================= 243 passed, 2 warnings in 1.10s ======================== 1662s rdjoqkol test state = false 1662s + echo 'rdjoqkol test state = false' 1662s + for TEST_SUBSET in $modpath/tests/* 1662s + echo /usr/lib/python3/dist-packages/pandas/tests/test_flags.py 1662s + grep -q -e __pycache__ 1662s + PANDAS_CI=1 1662s + LC_ALL=C.UTF-8 1662s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_flags.py 1663s ============================= test session starts ============================== 1663s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1663s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1663s rootdir: /usr/lib/python3/dist-packages/pandas 1663s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1663s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1663s asyncio: mode=Mode.STRICT 1663s collected 5 items 1663s 1663s ../../../usr/lib/python3/dist-packages/pandas/tests/test_flags.py ..... 1663s 1663s =============================== warnings summary =============================== 1663s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1663s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-9djuz382' 1663s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1663s 1663s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1663s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-vcf1ahvs' 1663s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1663s 1663s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1663s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1663s ============================= slowest 30 durations ============================= 1663s 1663s (15 durations < 0.005s hidden. Use -vv to show these durations.) 
1663s ======================== 5 passed, 2 warnings in 0.12s ========================= 1663s + echo 'rdjoqkol test state = false' 1663s rdjoqkol test state = false 1663s + for TEST_SUBSET in $modpath/tests/* 1663s + echo /usr/lib/python3/dist-packages/pandas/tests/test_multilevel.py 1663s + grep -q -e __pycache__ 1663s + PANDAS_CI=1 1663s + LC_ALL=C.UTF-8 1663s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_multilevel.py 1663s ============================= test session starts ============================== 1664s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1664s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1664s rootdir: /usr/lib/python3/dist-packages/pandas 1664s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1664s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1664s asyncio: mode=Mode.STRICT 1664s collected 19 items 1664s 1664s ../../../usr/lib/python3/dist-packages/pandas/tests/test_multilevel.py ................... 1664s 1664s =============================== warnings summary =============================== 1664s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1664s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-h_sde0lb' 1664s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1664s 1664s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1664s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-u4091khn' 1664s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1664s 1664s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1664s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1664s ============================= slowest 30 durations ============================= 1664s 0.01s call tests/test_multilevel.py::TestMultiLevel::test_reindex_level 1664s 0.01s call tests/test_multilevel.py::TestMultiLevel::test_alignment 1664s 0.01s call tests/test_multilevel.py::TestMultiLevel::test_reindex_level_partial_selection 1664s 1664s (27 durations < 0.005s hidden. Use -vv to show these durations.) 
1664s ======================== 19 passed, 2 warnings in 0.21s ======================== 1664s rdjoqkol test state = false 1664s + echo 'rdjoqkol test state = false' 1664s + for TEST_SUBSET in $modpath/tests/* 1664s + echo /usr/lib/python3/dist-packages/pandas/tests/test_nanops.py 1664s + grep -q -e __pycache__ 1664s + PANDAS_CI=1 1664s + LC_ALL=C.UTF-8 1664s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_nanops.py 1664s ============================= test session starts ============================== 1664s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1664s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1664s rootdir: /usr/lib/python3/dist-packages/pandas 1664s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1664s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1664s asyncio: mode=Mode.STRICT 1664s collected 245 items 1664s 1665s ../../../usr/lib/python3/dist-packages/pandas/tests/test_nanops.py ..................................................................................................................................................................................................................................................... 1665s 1665s =============================== warnings summary =============================== 1665s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1665s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ts2ml_qi' 1665s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1665s 1665s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1665s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-25mi9bsn' 1665s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1665s 1665s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1665s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1665s ============================= slowest 30 durations ============================= 1665s 0.30s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[True-0] 1665s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nankurt[True] 1665s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nanskew[True] 1665s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nankurt[False] 1665s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nanskew[False] 1665s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[True-1] 1665s 0.02s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[True-2] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanmedian[True] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nancorr_spearman 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[False-1] 
1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[False-0] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nansem[False-2] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanmedian[False] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanstd[True-0] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanstd[True-2] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanstd[True-1] 1665s 0.01s call tests/test_nanops.py::TestnanopsDataFrame::test_nanvar[True-0] 1665s 1665s (13 durations < 0.005s hidden. Use -vv to show these durations.) 1665s ======================= 245 passed, 2 warnings in 1.02s ======================== 1665s rdjoqkol test state = false 1665s + echo 'rdjoqkol test state = false' 1665s + for TEST_SUBSET in $modpath/tests/* 1665s + echo /usr/lib/python3/dist-packages/pandas/tests/test_optional_dependency.py 1665s + grep -q -e __pycache__ 1665s + PANDAS_CI=1 1665s + LC_ALL=C.UTF-8 1665s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_optional_dependency.py 1666s ============================= test session starts ============================== 1666s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1666s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1666s rootdir: /usr/lib/python3/dist-packages/pandas 1666s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1666s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1666s asyncio: mode=Mode.STRICT 1666s collected 5 items 1666s 1666s ../../../usr/lib/python3/dist-packages/pandas/tests/test_optional_dependency.py ..... 1666s 1666s =============================== warnings summary =============================== 1666s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1666s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-r9ul3lns' 1666s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1666s 1666s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1666s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-3r_sfkfg' 1666s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1666s 1666s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1666s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1666s ============================= slowest 30 durations ============================= 1666s 0.01s call tests/test_optional_dependency.py::test_xlrd_version_fallback 1666s 1666s (14 durations < 0.005s hidden. Use -vv to show these durations.) 
1666s ======================== 5 passed, 2 warnings in 0.12s ========================= 1666s rdjoqkol test state = false 1666s + echo 'rdjoqkol test state = false' 1666s + for TEST_SUBSET in $modpath/tests/* 1666s + echo /usr/lib/python3/dist-packages/pandas/tests/test_register_accessor.py 1666s + grep -q -e __pycache__ 1666s + PANDAS_CI=1 1666s + LC_ALL=C.UTF-8 1666s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_register_accessor.py 1667s ============================= test session starts ============================== 1667s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1667s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1667s rootdir: /usr/lib/python3/dist-packages/pandas 1667s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1667s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1667s asyncio: mode=Mode.STRICT 1667s collected 7 items 1667s 1667s ../../../usr/lib/python3/dist-packages/pandas/tests/test_register_accessor.py ....... 1667s 1667s =============================== warnings summary =============================== 1667s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1667s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-xea8vfl5' 1667s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1667s 1667s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1667s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-nngvlwyv' 1667s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1667s 1667s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1667s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1667s ============================= slowest 30 durations ============================= 1667s 1667s (21 durations < 0.005s hidden. Use -vv to show these durations.) 
1667s ======================== 7 passed, 2 warnings in 0.12s ========================= 1667s rdjoqkol test state = false 1667s + echo 'rdjoqkol test state = false' 1667s + for TEST_SUBSET in $modpath/tests/* 1667s + echo /usr/lib/python3/dist-packages/pandas/tests/test_sorting.py 1667s + grep -q -e __pycache__ 1667s + PANDAS_CI=1 1667s + LC_ALL=C.UTF-8 1667s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_sorting.py 1668s ============================= test session starts ============================== 1668s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1668s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1668s rootdir: /usr/lib/python3/dist-packages/pandas 1668s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1668s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1668s asyncio: mode=Mode.STRICT 1668s collected 54 items / 15 deselected / 39 selected 1668s 1669s ../../../usr/lib/python3/dist-packages/pandas/tests/test_sorting.py ....................................... 1669s 1669s =============================== warnings summary =============================== 1669s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1669s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-_61qbbor' 1669s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1669s 1669s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1669s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-2iwei_y4' 1669s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1669s 1669s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1669s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1669s ============================= slowest 30 durations ============================= 1669s 0.52s call tests/test_sorting.py::TestSorting::test_int64_overflow_groupby_large_range 1669s 0.19s call tests/test_sorting.py::TestSorting::test_int64_overflow_groupby_large_df_shuffled[mean] 1669s 0.19s call tests/test_sorting.py::TestSorting::test_int64_overflow_groupby_large_df_shuffled[median] 1669s 0.01s call tests/test_sorting.py::TestMerge::test_int64_overflow_outer_merge 1669s 1669s (26 durations < 0.005s hidden. Use -vv to show these durations.) 
1669s ================ 39 passed, 15 deselected, 2 warnings in 1.07s ================= 1669s rdjoqkol test state = false 1669s + echo 'rdjoqkol test state = false' 1669s + for TEST_SUBSET in $modpath/tests/* 1669s + echo /usr/lib/python3/dist-packages/pandas/tests/test_take.py 1669s + grep -q -e __pycache__ 1669s + PANDAS_CI=1 1669s + LC_ALL=C.UTF-8 1669s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/test_take.py 1670s ============================= test session starts ============================== 1670s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1670s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1670s rootdir: /usr/lib/python3/dist-packages/pandas 1670s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1670s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1670s asyncio: mode=Mode.STRICT 1670s collected 81 items 1670s 1670s ../../../usr/lib/python3/dist-packages/pandas/tests/test_take.py ................................................................................. 1670s 1670s =============================== warnings summary =============================== 1670s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1670s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-sk4e8pd_' 1670s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1670s 1670s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1670s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-i707qi53' 1670s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1670s 1670s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1670s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1670s ============================= slowest 30 durations ============================= 1670s 1670s (30 durations < 0.005s hidden. Use -vv to show these durations.) 
1670s ======================== 81 passed, 2 warnings in 0.22s ======================== 1670s + echo 'rdjoqkol test state = false' 1670s + for TEST_SUBSET in $modpath/tests/* 1670s rdjoqkol test state = false 1670s + echo /usr/lib/python3/dist-packages/pandas/tests/tools 1670s + grep -q -e __pycache__ 1670s + PANDAS_CI=1 1670s + LC_ALL=C.UTF-8 1670s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/tools 1671s ============================= test session starts ============================== 1671s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1671s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1671s rootdir: /usr/lib/python3/dist-packages/pandas 1671s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1671s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1671s asyncio: mode=Mode.STRICT 1671s collected 1510 items 1671s 1673s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_datetime.py ............................................................................ss....................................................................................................................................................ssssssss....................................................................ss................................................................................................................................................................................................................................................................................xx....ss.ssssss....................................................s........................................................................................................................................................................................................................................................................................................ssssssssss............................ 1674s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_numeric.py ...s.s....................................................................................................................................................................................................................................................xx.......................................................................................................................ssssss.s.s.................................sss...sss.s....ssss.s.s 1674s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_time.py ........... 
1674s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_timedelta.py ........................................................................ssssssssssss 1674s 1674s =============================== warnings summary =============================== 1674s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1674s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-9jsg9fmf' 1674s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1674s 1674s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1674s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-4s9a14rl' 1674s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1674s 1674s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1674s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1674s ============================= slowest 30 durations ============================= 1674s 0.05s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-%Y%m%d %H:%M:%S-True] 1674s 0.05s call tests/tools/test_to_datetime.py::TestToDatetimeMisc::test_to_datetime_timezone_name 1674s 0.05s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-%Y%m%d %H:%M:%S-True] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-%Y%m%d %H:%M:%S-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-None-True] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[array-None-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-%Y%m%d %H:%M:%S-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-None-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-None-True] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-%Y%m%d %H:%M:%S-True] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[deque-%Y%m%d %H:%M:%S-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-None-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[list-None-True] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-None-True] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-%Y%m%d %H:%M:%S-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-None-None] 1674s 0.04s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[tuple-%Y%m%d %H:%M:%S-True] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-None-None] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-None-True] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-%Y%m%d 
%H:%M:%S-True] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[None-True] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache[Index-%Y%m%d %H:%M:%S-None] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[None-None] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[%Y%m%d %H:%M:%S-True] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_cache_series[%Y%m%d %H:%M:%S-None] 1674s 0.02s call tests/tools/test_to_datetime.py::TestToDatetime::test_to_datetime_fixed_offset 1674s 0.01s call tests/tools/test_to_datetime.py::TestTimeConversionFormats::test_to_datetime_parse_tzname_or_tzoffset[%Y-%m-%d %H:%M:%S %Z-dates0-expected_dates0] 1674s 0.01s teardown tests/tools/test_to_timedelta.py::test_from_timedelta_arrow_dtype[ms] 1674s 1674s (2 durations < 0.005s hidden. Use -vv to show these durations.) 1674s =========== 1440 passed, 66 skipped, 4 xfailed, 2 warnings in 3.36s ============ 1674s + echo 'rdjoqkol test state = false' 1674s rdjoqkol test state = false 1674s + for TEST_SUBSET in $modpath/tests/* 1674s + echo /usr/lib/python3/dist-packages/pandas/tests/tseries 1674s + grep -q -e __pycache__ 1674s + PANDAS_CI=1 1674s + LC_ALL=C.UTF-8 1674s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/tseries 1676s ============================= test session starts ============================== 1676s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1676s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1676s rootdir: /usr/lib/python3/dist-packages/pandas 1676s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1676s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1676s asyncio: mode=Mode.STRICT 1676s collected 5480 items 1676s 1676s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/frequencies/test_freq_code.py ................... 1676s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/frequencies/test_frequencies.py .......... 
1677s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/frequencies/test_inference.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1677s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_calendar.py ........ 1677s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_federal.py ... 1677s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_holiday.py ................................................. 1677s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/holiday/test_observance.py ................................. 1677s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_day.py ....................... 1678s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_hour.py .............................................................................................. 1678s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_month.py ..................... 1678s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_quarter.py .............................................. 1678s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_business_year.py ................... 1678s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_common.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1678s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_custom_business_day.py ....... 1679s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_custom_business_hour.py ............................ 
1679s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_custom_business_month.py .................................................... 1679s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_dst.py .......................... 1679s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_easter.py .......... 1679s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_fiscal.py ............................................................................................................................................. 1679s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_index.py ........................ 1679s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_month.py ............................................................ 1683s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_offsets.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................x..................................................................................................................................................................................................................................................x.....................................................................................................................................................
........................................................................................... 1684s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_offsets_properties.py .. 1684s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_quarter.py ........................................................................................ 1686s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_ticks.py ............................................................................................................ 1686s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_week.py .............................................. 1686s ../../../usr/lib/python3/dist-packages/pandas/tests/tseries/offsets/test_year.py ................................. 1686s 1686s =============================== warnings summary =============================== 1686s tests/tseries/offsets/test_offsets_properties.py::test_on_offset_implementations 1686s /usr/lib/python3/dist-packages/dateutil/zoneinfo/__init__.py:26: UserWarning: I/O error(2): No such file or directory 1686s warnings.warn("I/O error({0}): {1}".format(e.errno, e.strerror)) 1686s 1686s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1686s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-zpxdt05v' 1686s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1686s 1686s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1686s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-ankte5ot' 1686s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1686s 1686s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1686s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1686s ============================= slowest 30 durations ============================= 1686s 0.56s call tests/tseries/offsets/test_offsets_properties.py::test_on_offset_implementations 1686s 0.34s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Micro] 1686s 0.30s call tests/tseries/offsets/test_offsets_properties.py::test_shift_across_dst 1686s 0.21s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Nano] 1686s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Milli] 1686s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Second] 1686s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Minute] 1686s 0.20s call tests/tseries/offsets/test_ticks.py::test_tick_equality[Hour] 1686s 0.09s call tests/tseries/offsets/test_custom_business_day.py::TestCustomBusinessDay::test_calendar 1686s 0.09s call tests/tseries/offsets/test_custom_business_hour.py::TestCustomBusinessHour::test_us_federal_holiday_with_datetime 1686s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Hour] 1686s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Micro] 1686s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Nano] 1686s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Milli] 1686s 0.08s call 
tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Second] 1686s 0.08s call tests/tseries/offsets/test_ticks.py::test_tick_add_sub[Minute] 1686s 0.07s call tests/tseries/offsets/test_offsets.py::TestCommon::test_add[Day-pytz.FixedOffset(-300)] 1686s 0.05s call tests/tseries/frequencies/test_inference.py::test_infer_freq_range[7-W-TUE] 1686s 0.02s teardown tests/tseries/offsets/test_year.py::test_add_out_of_pydatetime_range 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BusinessMonthBegin--2] 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BYearEnd--2] 1686s 0.02s call tests/tseries/offsets/test_business_month.py::test_apply_index[BusinessMonthBegin--2] 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BQuarterBegin--2] 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BYearBegin--2] 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[BQuarterEnd--2] 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[MonthBegin--2] 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[YearEnd--2] 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[QuarterBegin--2] 1686s 0.02s call tests/tseries/offsets/test_fiscal.py::TestFY5253LastOfMonthQuarter::test_offset 1686s 0.02s call tests/tseries/offsets/test_index.py::test_apply_index[YearBegin--2] 1686s ================= 5478 passed, 2 xfailed, 3 warnings in 11.53s ================= 1687s rdjoqkol test state = false 1687s + echo 'rdjoqkol test state = false' 1687s + for TEST_SUBSET in $modpath/tests/* 1687s + echo /usr/lib/python3/dist-packages/pandas/tests/tslibs 1687s + grep -q -e __pycache__ 1687s + PANDAS_CI=1 1687s + LC_ALL=C.UTF-8 1687s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/tslibs 1688s ============================= test session starts ============================== 1688s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1688s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1688s rootdir: /usr/lib/python3/dist-packages/pandas 1688s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1688s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1688s asyncio: mode=Mode.STRICT 1688s collected 1139 items 1688s 1688s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_api.py . 1688s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_array_to_datetime.py ............................................ 1688s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_ccalendar.py ................. 1690s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_conversion.py ...................................................................... 1690s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_fields.py .... 1690s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_libfrequencies.py ............ 1690s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_liboffsets.py .......................................................................... 
1690s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_np_datetime.py ........ 1690s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_npy_units.py .. 1690s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_parse_iso8601.py ................................................... 1695s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_parsing.py .................................ssssssssssssssssssssssssssssssssssssssss..sss...............................................................x...x................................. 1695s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_period.py ....................................... 1695s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_resolution.py ................... 1695s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_strptime.py ....... 1695s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_timedeltas.py ......................... 1696s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_timezones.py ....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1696s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_to_offset.py ................................................................................................... 1696s ../../../usr/lib/python3/dist-packages/pandas/tests/tslibs/test_tzconversion.py . 
1696s 1696s =============================== warnings summary =============================== 1696s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1696s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-204f4w0m' 1696s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1696s 1696s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1696s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-5ly_5ke7' 1696s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1696s 1696s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1696s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1696s ============================= slowest 30 durations ============================= 1696s 0.26s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[tzlocal()] 1696s 0.25s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['dateutil/Asia/Singapore'] 1696s 0.17s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[datetime.timezone(datetime.timedelta(days=-1, seconds=82800), 'foo')] 1696s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['UTC-02:15'] 1696s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['-02:15'] 1696s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[datetime.timezone(datetime.timedelta(seconds=3600))] 1696s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['+01:15'] 1696s 0.15s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['UTC+01:15'] 1696s 0.14s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%m %Y-True- ] 1696s 0.13s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[pytz.FixedOffset(300)] 1696s 0.13s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly[pytz.FixedOffset(-300)] 1696s 0.13s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y%m%d-True--] 1696s 0.13s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-True-.] 
1696s 0.13s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-True--] 1696s 0.13s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y%m%d-True--] 1696s 0.12s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%m %d %Y-True--] 1696s 0.11s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%d %m %Y-True--] 1696s 0.10s call tests/tslibs/test_conversion.py::test_tz_convert_single_matches_tz_convert_hourly['Asia/Tokyo'] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-False- ] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-False-/] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-True- ] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-True- ] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-False--] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-False-.] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y%m%d-True- ] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-False-/] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%Y %m %d-True--] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-False- ] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%y %m %d-False--] 1696s 0.09s call tests/tslibs/test_parsing.py::test_hypothesis_delimited_date[%m %d %Y-True- ] 1696s =========== 1094 passed, 43 skipped, 2 xfailed, 2 warnings in 8.78s ============ 1696s rdjoqkol test state = false 1696s + echo 'rdjoqkol test state = false' 1696s + for TEST_SUBSET in $modpath/tests/* 1696s + echo /usr/lib/python3/dist-packages/pandas/tests/util 1696s + grep -q -e __pycache__ 1696s + PANDAS_CI=1 1696s + LC_ALL=C.UTF-8 1696s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/util 1697s ============================= test session starts ============================== 1697s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1697s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1697s rootdir: /usr/lib/python3/dist-packages/pandas 1697s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1697s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1697s asyncio: mode=Mode.STRICT 1697s collected 916 items 1697s 1697s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_almost_equal.py .................................................................................................................................................................... 1697s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_attr_equal.py .......................................... 1697s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_categorical_equal.py .......... 1697s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_extension_array_equal.py ..................... 
1697s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_frame_equal.py ............................................................................................................... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_index_equal.py ................................................................ 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_interval_array_equal.py ....... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_numpy_array_equal.py ......................... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_produces_warning.py ............................................................................................................................ 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_assert_series_equal.py .............................................................................................. 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_deprecate.py ... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_deprecate_kwarg.py .............. 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_deprecate_nonkeyword_arguments.py ................... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_doc.py .... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_hashing.py ..................................................................................................................................................... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_numba.py . 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_rewrite_warning.py .......... 1698s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_shares_memory.py .s 1699s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_show_versions.py .... 1699s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_util.py ...sx.. 1699s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_args.py ...... 1699s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_args_and_kwargs.py ...... 1699s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_inclusive.py ........... 1699s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_validate_kwargs.py .................. 
1699s 1699s =============================== warnings summary =============================== 1699s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1699s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-3r27qmsv' 1699s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1699s 1699s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1699s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-lg7o3pra' 1699s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1699s 1699s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1699s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1699s ============================= slowest 30 durations ============================= 1699s 0.50s call tests/util/test_show_versions.py::test_show_versions 1699s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[0-7] 1699s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[1-7] 1699s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[0-6] 1699s 0.01s call tests/util/test_hashing.py::test_same_len_hash_collisions[1-6] 1699s 0.01s call tests/util/test_show_versions.py::test_show_versions_console_json 1699s 0.01s call tests/util/test_show_versions.py::test_json_output_match 1699s 1699s (23 durations < 0.005s hidden. Use -vv to show these durations.) 
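The two PytestCacheWarning entries repeated in every warnings summary above come from running the installed suite in place: pytest tries to create .pytest_cache under /usr/lib/python3/dist-packages/pandas, where the test user has no write access, so both the nodeids and stepwise cache writes fail with EACCES. They are cosmetic, but a wrapper that wanted a quiet run could disable the cache plugin or point the cache at a writable location; a minimal sketch, not part of the autopkgtest runner, with example paths:

    # Silence the cache warnings by dropping the cache plugin (or redirecting it).
    import sys
    import pytest

    args = [
        "-p", "no:cacheprovider",                        # disable the cache plugin
        # or: "-o", "cache_dir=/tmp/pandas-pytest-cache",  # example writable path
        "-m", "not slow",
        "/usr/lib/python3/dist-packages/pandas/tests/util",
    ]
    sys.exit(pytest.main(args))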
1699s ============ 913 passed, 2 skipped, 1 xfailed, 2 warnings in 2.00s ============= 1699s rdjoqkol test state = false 1699s + echo 'rdjoqkol test state = false' 1699s + for TEST_SUBSET in $modpath/tests/* 1699s + echo /usr/lib/python3/dist-packages/pandas/tests/window 1699s + grep -q -e __pycache__ 1699s + PANDAS_CI=1 1699s + LC_ALL=C.UTF-8 1699s + xvfb-run --auto-servernum '--server-args=-screen 0 1024x768x24' python3.12 -m pytest --tb=long -s -m 'not slow' -c /tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml --deb-data-root-dir=/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests --rootdir=/usr/lib/python3/dist-packages/pandas /usr/lib/python3/dist-packages/pandas/tests/window 1701s ============================= test session starts ============================== 1701s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 1701s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1701s rootdir: /usr/lib/python3/dist-packages/pandas 1701s configfile: ../../../../../tmp/autopkgtest.rcV9Ni/build.Jt5/src/pyproject.toml 1701s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1701s asyncio: mode=Mode.STRICT 1701s collected 10242 items / 536 deselected / 1 skipped / 9706 selected 1701s 1703s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_ewm.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1704s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_expanding.py ........x.......................x..x..x..x..x..x....................x.......................x..x..x..x..x..x................................................................................................................................................................................................................................................................................ 
1705s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_rolling.py ..............x..x............................................x..x..x..x..x..x..x..x..x..x..x..x......................................x..x............................................x..x..x..x..x..x..x..x..x..x..x..x................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1707s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_api.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1707s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_apply.py ...s....sssss..........s..s....................................................... 1708s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_base_indexer.py .................................................................................................................................................................................................................................... 1708s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_cython_aggregations.py ........................................................................ 
1712s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_dtypes.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1712s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_ewm.py .......................................................................................................................................................................................................................................ssssssssssss........ssssssssssssssss................ 1714s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_expanding.py ..........x................................................................................................................................................................................................ss....s...................s..s......s............................................................................................. 
1714s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_groupby.py ................................................................................................................... 1714s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 1716s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_pairwise.py ........................................................................................................................................................................................................................................................................................................................ 1718s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1720s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_functions.py .................................................................................................................................................................................................................................................................................................................................................................................................................................. 1720s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_quantile.py .......................................................................................................................................................................................... 1720s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py .................................................................... 
1721s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_timeseries_window.py ..................................................................................s 1721s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_win_type.py ............................................................................................................................................................................................................................................................................................... 1721s 1721s =============================== warnings summary =============================== 1721s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 1721s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-8mqfh52l' 1721s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 1721s 1721s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 1721s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/pytest-cache-files-1xbt_n3d' 1721s session.config.cache.set(STEPWISE_CACHE_DIR, []) 1721s 1721s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 1721s -- generated xml file: /tmp/autopkgtest.rcV9Ni/autopkgtest_tmp/test-data.xml --- 1721s ============================= slowest 30 durations ============================= 1721s 0.28s call tests/window/test_rolling_functions.py::test_rolling_functions_window_non_shrinkage[14] 1721s 0.18s call tests/window/test_timeseries_window.py::TestRollingTS::test_all2[std] 1721s 0.13s call tests/window/test_expanding.py::test_iter_expanding_dataframe[df5-expected5-3] 1721s 0.12s call tests/window/test_apply.py::test_time_rule_frame[False] 1721s 0.09s call tests/window/test_dtypes.py::test_series_dtypes[category-None-mean-data13-expected_data13-True-None] 1721s 0.08s call tests/window/test_apply.py::test_frame[False] 1721s 0.07s call tests/window/moments/test_moments_consistency_expanding.py::test_moments_consistency_var[all_data16-0-0] 1721s 0.05s teardown tests/window/test_win_type.py::test_rolling_center_axis_1 1721s 0.05s call tests/window/test_expanding.py::test_expanding_corr_pairwise 1721s 0.05s call tests/window/test_expanding.py::test_expanding_cov_pairwise 1721s 0.04s call tests/window/test_apply.py::test_center_reindex_frame[False] 1721s 0.03s call tests/window/test_apply.py::test_min_periods[False-None-0] 1721s 0.03s call tests/window/test_apply.py::test_min_periods[False-1-0] 1721s 0.03s call tests/window/test_apply.py::test_center_reindex_series[False] 1721s 0.02s call tests/window/test_pairwise.py::test_rolling_pairwise_cov_corr[corr] 1721s 0.02s call tests/window/test_pairwise.py::test_rolling_pairwise_cov_corr[cov] 1721s 0.02s call tests/window/test_ewm.py::test_ewm_pairwise_cov_corr[corr] 1721s 0.02s call tests/window/test_ewm.py::test_ewm_pairwise_cov_corr[cov] 1721s 0.02s call tests/window/test_pairwise.py::TestPairwise::test_cov_mulittindex 1721s 0.02s call tests/window/test_apply.py::test_nans[False] 1721s 0.02s call tests/window/test_pairwise.py::test_flex_binary_frame[corr] 1721s 0.02s call tests/window/test_pairwise.py::test_flex_binary_frame[cov] 1721s 
0.02s call tests/window/test_expanding.py::test_expanding_corr_pairwise_diff_length 1721s 0.02s call tests/window/test_expanding.py::test_expanding_cov_pairwise_diff_length 1721s 0.02s call tests/window/test_apply.py::test_min_periods[False-2-0] 1721s 0.01s call tests/window/test_api.py::test_agg[1] 1721s 0.01s call tests/window/test_groupby.py::TestExpanding::test_expanding_corr_cov[cov] 1721s 0.01s call tests/window/test_api.py::test_agg[None] 1721s 0.01s call tests/window/test_api.py::test_agg[2] 1721s 0.01s call tests/window/test_rolling.py::test_multi_index_names 1721s == 9014 passed, 650 skipped, 536 deselected, 43 xfailed, 2 warnings in 21.71s == 1722s rdjoqkol test state = false 1722s + echo 'rdjoqkol test state = false' 1722s + false 1722s autopkgtest [00:50:43]: test unittests3: -----------------------] 1723s unittests3 FAIL non-zero exit status 1 1723s autopkgtest [00:50:44]: test unittests3: - - - - - - - - - - results - - - - - - - - - - 1723s autopkgtest [00:50:44]: test ignoredtests: preparing testbed 1725s Note, using file '/tmp/autopkgtest.rcV9Ni/5-autopkgtest-satdep.dsc' to get the build dependencies 1725s Reading package lists... 1725s Building dependency tree... 1725s Reading state information... 1726s Starting pkgProblemResolver with broken count: 0 1726s Starting 2 pkgProblemResolver with broken count: 0 1726s Done 1726s The following NEW packages will be installed: 1726s libpq5 python3-psycopg2 python3-pymysql 1726s 0 upgraded, 3 newly installed, 0 to remove and 0 not upgraded. 1726s Need to get 435 kB of archives. 1726s After this operation, 2096 kB of additional disk space will be used. 1726s Get:1 http://ftpmaster.internal/ubuntu plucky/main amd64 libpq5 amd64 17.0-1 [249 kB] 1726s Get:2 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-psycopg2 amd64 2.9.9-2build1 [146 kB] 1726s Get:3 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-pymysql all 1.1.1-1ubuntu1 [39.4 kB] 1727s Fetched 435 kB in 0s (875 kB/s) 1727s Selecting previously unselected package libpq5:amd64. 1727s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 98795 files and directories currently installed.) 1727s Preparing to unpack .../libpq5_17.0-1_amd64.deb ... 1727s Unpacking libpq5:amd64 (17.0-1) ... 1727s Selecting previously unselected package python3-psycopg2. 1727s Preparing to unpack .../python3-psycopg2_2.9.9-2build1_amd64.deb ... 1727s Unpacking python3-psycopg2 (2.9.9-2build1) ... 1727s Selecting previously unselected package python3-pymysql. 1727s Preparing to unpack .../python3-pymysql_1.1.1-1ubuntu1_all.deb ... 1727s Unpacking python3-pymysql (1.1.1-1ubuntu1) ... 1727s Setting up libpq5:amd64 (17.0-1) ... 1727s Setting up python3-psycopg2 (2.9.9-2build1) ... 1727s Setting up python3-pymysql (1.1.1-1ubuntu1) ... 1727s Processing triggers for libc-bin (2.40-1ubuntu3) ... 1728s Reading package lists... 1728s Building dependency tree... 1728s Reading state information... 
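libpq5, python3-psycopg2 and python3-pymysql, installed just above for the ignoredtests pass, are optional database drivers: pandas' io/sql tests import them when available (and skip otherwise), so installing them lets the PostgreSQL- and MySQL-backed paths run. The read_sql/to_sql API they feed into also accepts any DB-API connection; a minimal, self-contained sketch using the standard-library sqlite3 driver instead of those servers:

    # Round-trip a small frame through a SQL table; sqlite3 keeps it self-contained.
    import sqlite3
    import pandas as pd

    conn = sqlite3.connect(":memory:")
    pd.DataFrame({"a": [1, 2, 3], "b": ["x", "y", "z"]}).to_sql("demo", conn, index=False)
    print(pd.read_sql("SELECT * FROM demo", conn))
    conn.close()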
1729s Starting pkgProblemResolver with broken count: 0 1729s Starting 2 pkgProblemResolver with broken count: 0 1729s Done 1729s The following NEW packages will be installed: 1729s autopkgtest-satdep 1729s 0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded. 1729s Need to get 0 B/696 B of archives. 1729s After this operation, 0 B of additional disk space will be used. 1729s Get:1 /tmp/autopkgtest.rcV9Ni/6-autopkgtest-satdep.deb autopkgtest-satdep amd64 0 [696 B] 1729s Selecting previously unselected package autopkgtest-satdep. 1729s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 98867 files and directories currently installed.) 1729s Preparing to unpack .../6-autopkgtest-satdep.deb ... 1729s Unpacking autopkgtest-satdep (0) ... 1729s Setting up autopkgtest-satdep (0) ... 1730s autopkgtest: WARNING: package python3-pandas:i386 is not installed though it should be 1731s (Reading database ... 98867 files and directories currently installed.) 1731s Removing autopkgtest-satdep (0) ... 1732s autopkgtest [00:50:53]: test ignoredtests: [----------------------- 1732s === python3.13 === 1733s tests that use numba (may crash on non-x86) - checked with grep -rl -e numba pandas/tests - -m not slow because there are enough to time out otherwise 1736s ============================= test session starts ============================== 1736s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 1736s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 1736s rootdir: /usr/lib/python3/dist-packages/pandas/tests 1736s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 1736s asyncio: mode=Mode.STRICT 1736s collected 10446 items / 536 deselected / 2 skipped / 9910 selected 1736s 1738s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_ufunc.py ....xx.........xxxxxxxx.xx....s. 1739s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py ..............................s 1740s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 1741s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 1741s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_numba.py . 
1764s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_ewm.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1773s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_expanding.py ........x.......................x..x..x..x..x..x....................x.......................x..x..x..x..x..x................................................................................................................................................................................................................................................................................ 1791s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_rolling.py ..............x..x............................................x..x..x..x..x..x..x..x..x..x..x..x......................................x..x............................................x..x..x..x..x..x..x..x..x..x..x..x................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 
1812s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_api.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1814s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_apply.py ...s....sssss..........s..s....................................................... 1819s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_base_indexer.py .................................................................................................................................................................................................................................... 1821s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_cython_aggregations.py ........................................................................ 1874s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_dtypes.py .........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 1880s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_ewm.py .......................................................................................................................................................................................................................................ssssssssssss........ssssssssssssssss................ 1887s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_expanding.py ..........x................................................................................................................................................................................................ss....s...................s..s......s............................................................................................. 1889s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_groupby.py ................................................................................................................... 1897s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 1905s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_pairwise.py ........................................................................................................................................................................................................................................................................................................................ 
1924s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 1934s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_functions.py .................................................................................................................................................................................................................................................................................................................................................................................................................................. 1940s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_quantile.py .......................................................................................................................................................................................... 1945s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py .................................................................... 1948s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_timeseries_window.py ..................................................................................s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_win_type.py ............................................................................................................................................................................................................................................................................................... 2036s 2036s =============================== warnings summary =============================== 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_numba.py:11 2036s /usr/lib/python3/dist-packages/pandas/tests/groupby/test_numba.py:11: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s pytestmark = pytest.mark.single_cpu 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py:944 2036s /usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py:944: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s @pytest.mark.single_cpu 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py:14 2036s /usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s pytestmark = pytest.mark.single_cpu 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:22 2036s /usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:22: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s pytestmark = pytest.mark.single_cpu 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:230 2036s /usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:230: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s @pytest.mark.single_cpu 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:21 2036s /usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:21: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s pytestmark = pytest.mark.single_cpu 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:326 2036s /usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:326: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s @pytest.mark.slow 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_online.py:11 2036s /usr/lib/python3/dist-packages/pandas/tests/window/test_online.py:11: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s pytestmark = pytest.mark.single_cpu 2036s 2036s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py:155 2036s /usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py:155: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2036s @pytest.mark.slow 2036s 2036s frame/test_ufunc.py: 32 warnings 2036s groupby/test_timegrouper.py: 31 warnings 2036s groupby/transform/test_numba.py: 53 warnings 2036s groupby/aggregate/test_numba.py: 35 warnings 2036s util/test_numba.py: 1 warning 2036s window/moments/test_moments_consistency_ewm.py: 1088 warnings 2036s window/moments/test_moments_consistency_expanding.py: 380 warnings 2036s window/moments/test_moments_consistency_rolling.py: 760 warnings 2036s window/test_api.py: 937 warnings 2036s window/test_apply.py: 82 warnings 2036s window/test_base_indexer.py: 228 warnings 2036s window/test_cython_aggregations.py: 72 warnings 2036s window/test_dtypes.py: 2580 warnings 2036s window/test_ewm.py: 283 warnings 2036s window/test_expanding.py: 333 warnings 2036s window/test_groupby.py: 115 warnings 2036s window/test_numba.py: 51 warnings 2036s window/test_pairwise.py: 312 warnings 2036s window/test_rolling.py: 888 warnings 2036s window/test_rolling_functions.py: 418 warnings 2036s window/test_rolling_quantile.py: 186 warnings 2036s window/test_rolling_skew_kurt.py: 68 warnings 2036s window/test_timeseries_window.py: 83 warnings 2036s window/test_win_type.py: 287 warnings 2036s /usr/lib/python3/dist-packages/py/_process/forkedfunc.py:45: DeprecationWarning: This process (pid=9624) is multi-threaded, use of fork() may lead to deadlocks in the child. 2036s pid = os.fork() 2036s 2036s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 2036s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/pytest-cache-files-hfojnps6' 2036s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 2036s 2036s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 2036s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/pytest-cache-files-iw4rzc0t' 2036s session.config.cache.set(STEPWISE_CACHE_DIR, []) 2036s 2036s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 2036s = 9064 passed, 793 skipped, 536 deselected, 55 xfailed, 9314 warnings in 301.61s (0:05:01) = 2037s tests with a run=False xfail for hdf5 crashes - see xfail_tests_nonintel_io.patch 2039s ============================= test session starts ============================== 2039s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 2039s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 2039s rootdir: /usr/lib/python3/dist-packages/pandas/tests/io/pytables 2039s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 2039s asyncio: mode=Mode.STRICT 2039s collected 278 items 2039s 2047s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py ...................................................FFFFFFFFF................................................................................................................................... 2049s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py ....F................ 
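The PytestUnknownMarkWarning entries in the warnings summary above are a side effect of how this python3.13 pass is invoked: its session header shows rootdir /usr/lib/python3/dist-packages/pandas/tests and no configfile line, so the marker registrations pandas normally picks up from its pyproject.toml (single_cpu, slow, ...) are not loaded and pytest treats the marks as unknown. If the marks needed registering without that config file, a conftest.py hook would do it; a minimal sketch, with example description strings rather than pandas' own wording:

    # Register the marks so pytest stops flagging them as unknown.
    def pytest_configure(config):
        config.addinivalue_line("markers", "single_cpu: run on a single CPU")
        config.addinivalue_line("markers", "slow: slow test, excluded by -m 'not slow'")

The test_complibs failures reported below all have the same shape: the assertion compares the bare library name 'blosc2' against what PyTables reports for the stored dataset, which here includes the internal codec ('blosc2:blosclz'). If only the outer library matters, a prefix comparison tolerates both forms; a sketch of that check as an illustration, not the upstream fix:

    # 'blosc2:blosclz' should count as a match for 'blosc2'; None never matches.
    def complib_matches(reported, requested):
        return reported is not None and reported.split(":", 1)[0] == requested

    assert complib_matches("blosc2:blosclz", "blosc2")
    assert not complib_matches(None, "blosc2")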
2053s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py ............................................F..................... 2053s 2053s =================================== FAILURES =================================== 2053s ___________________________ test_complibs[blosc2-1] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-55/test_complibs_blosc2_1_0') 2053s lvl = 1, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-2] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-56/test_complibs_blosc2_2_0') 2053s lvl = 2, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not 
tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-3] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-57/test_complibs_blosc2_3_0') 2053s lvl = 3, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-4] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-58/test_complibs_blosc2_4_0') 2053s lvl = 4, lib = 'blosc2' 2053s request = > 2053s 2053s 
@pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-5] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-59/test_complibs_blosc2_5_0') 2053s lvl = 5, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s 
tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-6] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-60/test_complibs_blosc2_6_0') 2053s lvl = 6, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-7] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-61/test_complibs_blosc2_7_0') 2053s lvl = 7, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if 
PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-8] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-62/test_complibs_blosc2_8_0') 2053s lvl = 8, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + 
blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ___________________________ test_complibs[blosc2-9] ____________________________ 2053s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-63/test_complibs_blosc2_9_0') 2053s lvl = 9, lib = 'blosc2' 2053s request = > 2053s 2053s @pytest.mark.parametrize("lvl", range(10)) 2053s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2053s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2053s @pytest.mark.skipif( 2053s not PY311 and is_ci_environment() and is_platform_linux(), 2053s reason="Segfaulting in a CI environment" 2053s # with xfail, would sometimes raise UnicodeDecodeError 2053s # invalid state byte 2053s ) 2053s def test_complibs(tmp_path, lvl, lib, request): 2053s # GH14478 2053s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2053s request.applymarker( 2053s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2053s ) 2053s df = DataFrame( 2053s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2053s ) 2053s 2053s # Remove lzo if its not available on this platform 2053s if not tables.which_lib_version("lzo"): 2053s pytest.skip("lzo not available") 2053s # Remove bzip2 if its not available on this platform 2053s if not tables.which_lib_version("bzip2"): 2053s pytest.skip("bzip2 not available") 2053s 2053s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2053s gname = f"{lvl}_{lib}" 2053s 2053s # Write and read file to see if data is consistent 2053s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2053s result = read_hdf(tmpfile, gname) 2053s tm.assert_frame_equal(result, df) 2053s 2053s # Open file and check metadata for correct amount of compression 2053s with tables.open_file(tmpfile, mode="r") as h5table: 2053s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2053s assert node.filters.complevel == lvl 2053s if lvl == 0: 2053s assert node.filters.complib is None 2053s else: 2053s > assert node.filters.complib == lib 2053s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2053s E 2053s E - blosc2 2053s E + blosc2:blosclz 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2053s ______________________ test_append_frame_column_oriented _______________________ 2053s self = 2053s node = , kwargs = {'side': 'right'} 2053s value = 2053s slobj = slice(0, 4, None) 2053s 2053s def visit_Subscript(self, node, **kwargs) -> ops.Term: 2053s # only allow simple subscripts 2053s 2053s value = self.visit(node.value) 2053s slobj = self.visit(node.slice) 2053s try: 2053s value = value.value 2053s except AttributeError: 2053s pass 2053s 2053s if isinstance(slobj, Term): 2053s # In py39 np.ndarray lookups with Term containing int raise 2053s slobj = slobj.value 2053s 2053s try: 2053s > return self.const_type(value[slobj], self.env) 2053s E TypeError: 'builtin_function_or_method' object is not subscriptable 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:468: TypeError 2053s 2053s The above exception was the direct cause of the following exception: 2053s 2053s setup_path = 'tmp.__2f424fc8-ea17-47ce-8d8e-0a451601bdc7__.h5' 2053s 2053s @pytest.mark.xfail(condition=PY312 or is_crashing_arch, reason="https://bugs.debian.org/1055801 and https://bugs.debian.org/790925",raises=ValueError,strict=False, run=not is_crashing_arch) 2053s def 
test_append_frame_column_oriented(setup_path): 2053s with ensure_clean_store(setup_path) as store: 2053s # column oriented 2053s df = DataFrame( 2053s np.random.default_rng(2).standard_normal((10, 4)), 2053s columns=Index(list("ABCD"), dtype=object), 2053s index=date_range("2000-01-01", periods=10, freq="B"), 2053s ) 2053s df.index = df.index._with_freq(None) # freq doesn't round-trip 2053s 2053s _maybe_remove(store, "df1") 2053s store.append("df1", df.iloc[:, :2], axes=["columns"]) 2053s store.append("df1", df.iloc[:, 2:]) 2053s tm.assert_frame_equal(store["df1"], df) 2053s 2053s result = store.select("df1", "columns=A") 2053s expected = df.reindex(columns=["A"]) 2053s tm.assert_frame_equal(expected, result) 2053s 2053s # selection on the non-indexable 2053s > result = store.select("df1", ("columns=A", "index=df.index[0:4]")) 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py:311: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s File path: /tmp/tmp8roilk1l/tmp.__2f424fc8-ea17-47ce-8d8e-0a451601bdc7__.h5 2053s 2053s key = 'df1', where = [columns=A, index=df.index[0:4]], start = None, stop = None 2053s columns = None, iterator = False, chunksize = None, auto_close = False 2053s 2053s def select( 2053s self, 2053s key: str, 2053s where=None, 2053s start=None, 2053s stop=None, 2053s columns=None, 2053s iterator: bool = False, 2053s chunksize: int | None = None, 2053s auto_close: bool = False, 2053s ): 2053s """ 2053s Retrieve pandas object stored in file, optionally based on where criteria. 2053s 2053s .. warning:: 2053s 2053s Pandas uses PyTables for reading and writing HDF5 files, which allows 2053s serializing object-dtype data with pickle when using the "fixed" format. 2053s Loading pickled data received from untrusted sources can be unsafe. 2053s 2053s See: https://docs.python.org/3/library/pickle.html for more. 2053s 2053s Parameters 2053s ---------- 2053s key : str 2053s Object being retrieved from file. 2053s where : list or None 2053s List of Term (or convertible) objects, optional. 2053s start : int or None 2053s Row number to start selection. 2053s stop : int, default None 2053s Row number to stop selection. 2053s columns : list or None 2053s A list of columns that if not None, will limit the return columns. 2053s iterator : bool or False 2053s Returns an iterator. 2053s chunksize : int or None 2053s Number or rows to include in iteration, return an iterator. 2053s auto_close : bool or False 2053s Should automatically close the store when finished. 2053s 2053s Returns 2053s ------- 2053s object 2053s Retrieved object from file. 
2053s 2053s Examples 2053s -------- 2053s >>> df = pd.DataFrame([[1, 2], [3, 4]], columns=['A', 'B']) 2053s >>> store = pd.HDFStore("store.h5", 'w') # doctest: +SKIP 2053s >>> store.put('data', df) # doctest: +SKIP 2053s >>> store.get('data') # doctest: +SKIP 2053s >>> print(store.keys()) # doctest: +SKIP 2053s ['/data1', '/data2'] 2053s >>> store.select('/data1') # doctest: +SKIP 2053s A B 2053s 0 1 2 2053s 1 3 4 2053s >>> store.select('/data1', where='columns == A') # doctest: +SKIP 2053s A 2053s 0 1 2053s 1 3 2053s >>> store.close() # doctest: +SKIP 2053s """ 2053s group = self.get_node(key) 2053s if group is None: 2053s raise KeyError(f"No object named {key} in the file") 2053s 2053s # create the storer and axes 2053s where = _ensure_term(where, scope_level=1) 2053s s = self._create_storer(group) 2053s s.infer_axes() 2053s 2053s # function to call on iteration 2053s def func(_start, _stop, _where): 2053s return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2053s 2053s # create the iterator 2053s it = TableIterator( 2053s self, 2053s s, 2053s func, 2053s where=where, 2053s nrows=s.nrows, 2053s start=start, 2053s stop=stop, 2053s iterator=iterator, 2053s chunksize=chunksize, 2053s auto_close=auto_close, 2053s ) 2053s 2053s > return it.get_result() 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:915: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s coordinates = False 2053s 2053s def get_result(self, coordinates: bool = False): 2053s # return the actual iterator 2053s if self.chunksize is not None: 2053s if not isinstance(self.s, Table): 2053s raise TypeError("can only use an iterator or chunksize on a table") 2053s 2053s self.coordinates = self.s.read_coordinates(where=self.where) 2053s 2053s return self 2053s 2053s # if specified read via coordinates (necessary for multiple selections 2053s if coordinates: 2053s if not isinstance(self.s, Table): 2053s raise TypeError("can only read_coordinates on a table") 2053s where = self.s.read_coordinates( 2053s where=self.where, start=self.start, stop=self.stop 2053s ) 2053s else: 2053s where = self.where 2053s 2053s # directly return the result 2053s > results = self.func(self.start, self.stop, where) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:2038: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s _start = 0, _stop = 4, _where = [columns=A, index=df.index[0:4]] 2053s 2053s def func(_start, _stop, _where): 2053s > return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:899: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7d565e0116a0> 2053s where = [columns=A, index=df.index[0:4]], columns = None, start = 0, stop = 4 2053s 2053s def read( 2053s self, 2053s where=None, 2053s columns=None, 2053s start: int | None = None, 2053s stop: int | None = None, 2053s ): 2053s # validate the version 2053s self.validate_version(where) 2053s 2053s # infer the data kind 2053s if not self.infer_axes(): 2053s return None 2053s 2053s > result = self._read_axes(where=where, start=start, stop=stop) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:4640: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 
<[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7d565e0116a0> 2053s where = [columns=A, index=df.index[0:4]], start = 0, stop = 4 2053s 2053s def _read_axes( 2053s self, where, start: int | None = None, stop: int | None = None 2053s ) -> list[tuple[np.ndarray, np.ndarray] | tuple[Index, Index]]: 2053s """ 2053s Create the axes sniffed from the table. 2053s 2053s Parameters 2053s ---------- 2053s where : ??? 2053s start : int or None, default None 2053s stop : int or None, default None 2053s 2053s Returns 2053s ------- 2053s List[Tuple[index_values, column_values]] 2053s """ 2053s # create the selection 2053s > selection = Selection(self, where=where, start=start, stop=stop) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:3826: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s table = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7d565e0116a0> 2053s where = [columns=A, index=df.index[0:4]], start = 0, stop = 4 2053s 2053s def __init__( 2053s self, 2053s table: Table, 2053s where=None, 2053s start: int | None = None, 2053s stop: int | None = None, 2053s ) -> None: 2053s self.table = table 2053s self.where = where 2053s self.start = start 2053s self.stop = stop 2053s self.condition = None 2053s self.filter = None 2053s self.terms = None 2053s self.coordinates = None 2053s 2053s if is_list_like(where): 2053s # see if we have a passed coordinate like 2053s with suppress(ValueError): 2053s inferred = lib.infer_dtype(where, skipna=False) 2053s if inferred in ("integer", "boolean"): 2053s where = np.asarray(where) 2053s if where.dtype == np.bool_: 2053s start, stop = self.start, self.stop 2053s if start is None: 2053s start = 0 2053s if stop is None: 2053s stop = self.table.nrows 2053s self.coordinates = np.arange(start, stop)[where] 2053s elif issubclass(where.dtype.type, np.integer): 2053s if (self.start is not None and (where < self.start).any()) or ( 2053s self.stop is not None and (where >= self.stop).any() 2053s ): 2053s raise ValueError( 2053s "where must have index locations >= start and < stop" 2053s ) 2053s self.coordinates = where 2053s 2053s if self.coordinates is None: 2053s > self.terms = self.generate(where) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5367: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s where = [columns=A, index=df.index[0:4]] 2053s 2053s def generate(self, where): 2053s """where can be a : dict,list,tuple,string""" 2053s if where is None: 2053s return None 2053s 2053s q = self.table.queryables() 2053s try: 2053s > return PyTablesExpr(where, queryables=q, encoding=self.table.encoding) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5380: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = (columns=A) & (index=df.index[0:4]) 2053s where = [columns=A, index=df.index[0:4]] 2053s queryables = {'columns': name->columns,cname->columns,axis->1,pos->0,kind->string, 'index': None} 2053s encoding = 'UTF-8', scope_level = 0 2053s 2053s def __init__( 2053s self, 2053s where, 2053s queryables: dict[str, Any] | None = None, 2053s encoding=None, 2053s scope_level: int = 0, 2053s ) -> None: 2053s where = _validate_where(where) 2053s 2053s self.encoding = encoding 2053s self.condition = None 2053s self.filter = None 2053s self.terms = None 2053s 
self._visitor = None 2053s 2053s # capture the environment if needed 2053s local_dict: _scope.DeepChainMap[Any, Any] | None = None 2053s 2053s if isinstance(where, PyTablesExpr): 2053s local_dict = where.env.scope 2053s _where = where.expr 2053s 2053s elif is_list_like(where): 2053s where = list(where) 2053s for idx, w in enumerate(where): 2053s if isinstance(w, PyTablesExpr): 2053s local_dict = w.env.scope 2053s else: 2053s where[idx] = _validate_where(w) 2053s _where = " & ".join([f"({w})" for w in com.flatten(where)]) 2053s else: 2053s # _validate_where ensures we otherwise have a string 2053s _where = where 2053s 2053s self.expr = _where 2053s self.env = PyTablesScope(scope_level + 1, local_dict=local_dict) 2053s 2053s if queryables is not None and isinstance(self.expr, str): 2053s self.env.queryables.update(queryables) 2053s self._visitor = PyTablesExprVisitor( 2053s self.env, 2053s queryables=queryables, 2053s parser="pytables", 2053s engine="pytables", 2053s encoding=encoding, 2053s ) 2053s > self.terms = self.parse() 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:610: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = (columns=A) & (index=df.index[0:4]) 2053s 2053s def parse(self): 2053s """ 2053s Parse an expression. 2053s """ 2053s > return self._visitor.visit(self.expr) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:824: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s clean = '(columns ==A )and (index ==df .index [0 :4 ])', method = 'visit_Module' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s expr = 2053s 2053s def visit_Module(self, node, **kwargs): 2053s if len(node.body) != 1: 2053s raise SyntaxError("only a single expression is allowed") 2053s expr = node.body[0] 2053s > return self.visit(expr, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:417: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {}, method = 'visit_Expr' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = 
{} 2053s 2053s def visit_Expr(self, node, **kwargs): 2053s > return self.visit(node.value, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:420: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s method = 'visit_BoolOp' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s visitor = .visitor at 0x7d565e11c4a0> 2053s operands = [, ] 2053s 2053s def visit_BoolOp(self, node, **kwargs): 2053s def visitor(x, y): 2053s lhs = self._try_visit_binop(x) 2053s rhs = self._try_visit_binop(y) 2053s 2053s op, op_class, lhs, rhs = self._maybe_transform_eq_ne(node, lhs, rhs) 2053s return self._maybe_evaluate_binop(op, node.op, lhs, rhs) 2053s 2053s operands = node.values 2053s > return reduce(visitor, operands) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:742: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s x = 2053s y = 2053s 2053s def visitor(x, y): 2053s lhs = self._try_visit_binop(x) 2053s > rhs = self._try_visit_binop(y) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:736: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s bop = 2053s 2053s def _try_visit_binop(self, bop): 2053s if isinstance(bop, (Op, Term)): 2053s return bop 2053s > return self.visit(bop) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:731: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s method = 'visit_Compare' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s ops = [] 2053s comps = [] 2053s op = 2053s binop = 2053s 2053s def visit_Compare(self, node, **kwargs): 2053s ops = node.ops 2053s comps = node.comparators 2053s 2053s # base case: we have something like a CMP b 2053s if len(comps) == 1: 2053s op = self.translate_In(ops[0]) 2053s binop = ast.BinOp(op=op, left=node.left, right=comps[0]) 2053s > return self.visit(binop) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:715: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {}, method = 'visit_BinOp' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s 2053s def visit_BinOp(self, node, **kwargs): 2053s > op, op_class, left, right = self._maybe_transform_eq_ne(node) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:531: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , left = index, right = None 2053s 2053s def _maybe_transform_eq_ne(self, node, left=None, right=None): 2053s if left is None: 2053s left = self.visit(node.left, side="left") 2053s if right is None: 2053s > right = self.visit(node.right, side="right") 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:453: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {'side': 'right'} 2053s method = 'visit_Subscript' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {'side': 'right'} 2053s value = 2053s slobj = slice(0, 4, None) 2053s 2053s def visit_Subscript(self, node, **kwargs) -> ops.Term: 2053s # only allow simple subscripts 2053s 2053s value = self.visit(node.value) 2053s slobj = self.visit(node.slice) 2053s try: 2053s value = value.value 2053s except AttributeError: 2053s pass 2053s 2053s if isinstance(slobj, Term): 2053s # In py39 np.ndarray lookups with Term containing int raise 2053s slobj = slobj.value 2053s 2053s try: 2053s return self.const_type(value[slobj], self.env) 2053s except TypeError as err: 2053s > raise ValueError( 2053s f"cannot subscript {repr(value)} with {repr(slobj)}" 2053s ) from err 2053s E ValueError: cannot subscript with slice(0, 4, None) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:470: ValueError 2053s __________________________ test_select_filter_corner ___________________________ 2053s setup_path = 'tmp.__ab2fca04-0164-4822-8164-7ed8156b14ef__.h5' 2053s 2053s @pytest.mark.xfail(condition=PY312 or is_crashing_arch, reason="https://bugs.debian.org/1055801 and https://bugs.debian.org/790925",raises=ValueError,strict=False, run=not is_crashing_arch) 2053s def 
test_select_filter_corner(setup_path): 2053s df = DataFrame(np.random.default_rng(2).standard_normal((50, 100))) 2053s df.index = [f"{c:3d}" for c in df.index] 2053s df.columns = [f"{c:3d}" for c in df.columns] 2053s 2053s with ensure_clean_store(setup_path) as store: 2053s store.put("frame", df, format="table") 2053s 2053s crit = "columns=df.columns[:75]" 2053s > result = store.select("frame", [crit]) 2053s 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py:899: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s File path: /tmp/tmp7nyd93l4/tmp.__ab2fca04-0164-4822-8164-7ed8156b14ef__.h5 2053s 2053s key = 'frame', where = [columns=df.columns[:75]], start = None, stop = None 2053s columns = None, iterator = False, chunksize = None, auto_close = False 2053s 2053s def select( 2053s self, 2053s key: str, 2053s where=None, 2053s start=None, 2053s stop=None, 2053s columns=None, 2053s iterator: bool = False, 2053s chunksize: int | None = None, 2053s auto_close: bool = False, 2053s ): 2053s """ 2053s Retrieve pandas object stored in file, optionally based on where criteria. 2053s 2053s .. warning:: 2053s 2053s Pandas uses PyTables for reading and writing HDF5 files, which allows 2053s serializing object-dtype data with pickle when using the "fixed" format. 2053s Loading pickled data received from untrusted sources can be unsafe. 2053s 2053s See: https://docs.python.org/3/library/pickle.html for more. 2053s 2053s Parameters 2053s ---------- 2053s key : str 2053s Object being retrieved from file. 2053s where : list or None 2053s List of Term (or convertible) objects, optional. 2053s start : int or None 2053s Row number to start selection. 2053s stop : int, default None 2053s Row number to stop selection. 2053s columns : list or None 2053s A list of columns that if not None, will limit the return columns. 2053s iterator : bool or False 2053s Returns an iterator. 2053s chunksize : int or None 2053s Number or rows to include in iteration, return an iterator. 2053s auto_close : bool or False 2053s Should automatically close the store when finished. 2053s 2053s Returns 2053s ------- 2053s object 2053s Retrieved object from file. 
2053s 2053s Examples 2053s -------- 2053s >>> df = pd.DataFrame([[1, 2], [3, 4]], columns=['A', 'B']) 2053s >>> store = pd.HDFStore("store.h5", 'w') # doctest: +SKIP 2053s >>> store.put('data', df) # doctest: +SKIP 2053s >>> store.get('data') # doctest: +SKIP 2053s >>> print(store.keys()) # doctest: +SKIP 2053s ['/data1', '/data2'] 2053s >>> store.select('/data1') # doctest: +SKIP 2053s A B 2053s 0 1 2 2053s 1 3 4 2053s >>> store.select('/data1', where='columns == A') # doctest: +SKIP 2053s A 2053s 0 1 2053s 1 3 2053s >>> store.close() # doctest: +SKIP 2053s """ 2053s group = self.get_node(key) 2053s if group is None: 2053s raise KeyError(f"No object named {key} in the file") 2053s 2053s # create the storer and axes 2053s where = _ensure_term(where, scope_level=1) 2053s s = self._create_storer(group) 2053s s.infer_axes() 2053s 2053s # function to call on iteration 2053s def func(_start, _stop, _where): 2053s return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2053s 2053s # create the iterator 2053s it = TableIterator( 2053s self, 2053s s, 2053s func, 2053s where=where, 2053s nrows=s.nrows, 2053s start=start, 2053s stop=stop, 2053s iterator=iterator, 2053s chunksize=chunksize, 2053s auto_close=auto_close, 2053s ) 2053s 2053s > return it.get_result() 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:915: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s coordinates = False 2053s 2053s def get_result(self, coordinates: bool = False): 2053s # return the actual iterator 2053s if self.chunksize is not None: 2053s if not isinstance(self.s, Table): 2053s raise TypeError("can only use an iterator or chunksize on a table") 2053s 2053s self.coordinates = self.s.read_coordinates(where=self.where) 2053s 2053s return self 2053s 2053s # if specified read via coordinates (necessary for multiple selections 2053s if coordinates: 2053s if not isinstance(self.s, Table): 2053s raise TypeError("can only read_coordinates on a table") 2053s where = self.s.read_coordinates( 2053s where=self.where, start=self.start, stop=self.stop 2053s ) 2053s else: 2053s where = self.where 2053s 2053s # directly return the result 2053s > results = self.func(self.start, self.stop, where) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:2038: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s _start = 0, _stop = 50, _where = [columns=df.columns[:75]] 2053s 2053s def func(_start, _stop, _where): 2053s > return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:899: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7d565e00d1d0> 2053s where = [columns=df.columns[:75]], columns = None, start = 0, stop = 50 2053s 2053s def read( 2053s self, 2053s where=None, 2053s columns=None, 2053s start: int | None = None, 2053s stop: int | None = None, 2053s ): 2053s # validate the version 2053s self.validate_version(where) 2053s 2053s # infer the data kind 2053s if not self.infer_axes(): 2053s return None 2053s 2053s > result = self._read_axes(where=where, start=start, stop=stop) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:4640: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 
<[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7d565e00d1d0> 2053s where = [columns=df.columns[:75]], start = 0, stop = 50 2053s 2053s def _read_axes( 2053s self, where, start: int | None = None, stop: int | None = None 2053s ) -> list[tuple[np.ndarray, np.ndarray] | tuple[Index, Index]]: 2053s """ 2053s Create the axes sniffed from the table. 2053s 2053s Parameters 2053s ---------- 2053s where : ??? 2053s start : int or None, default None 2053s stop : int or None, default None 2053s 2053s Returns 2053s ------- 2053s List[Tuple[index_values, column_values]] 2053s """ 2053s # create the selection 2053s > selection = Selection(self, where=where, start=start, stop=stop) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:3826: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s table = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7d565e00d1d0> 2053s where = [columns=df.columns[:75]], start = 0, stop = 50 2053s 2053s def __init__( 2053s self, 2053s table: Table, 2053s where=None, 2053s start: int | None = None, 2053s stop: int | None = None, 2053s ) -> None: 2053s self.table = table 2053s self.where = where 2053s self.start = start 2053s self.stop = stop 2053s self.condition = None 2053s self.filter = None 2053s self.terms = None 2053s self.coordinates = None 2053s 2053s if is_list_like(where): 2053s # see if we have a passed coordinate like 2053s with suppress(ValueError): 2053s inferred = lib.infer_dtype(where, skipna=False) 2053s if inferred in ("integer", "boolean"): 2053s where = np.asarray(where) 2053s if where.dtype == np.bool_: 2053s start, stop = self.start, self.stop 2053s if start is None: 2053s start = 0 2053s if stop is None: 2053s stop = self.table.nrows 2053s self.coordinates = np.arange(start, stop)[where] 2053s elif issubclass(where.dtype.type, np.integer): 2053s if (self.start is not None and (where < self.start).any()) or ( 2053s self.stop is not None and (where >= self.stop).any() 2053s ): 2053s raise ValueError( 2053s "where must have index locations >= start and < stop" 2053s ) 2053s self.coordinates = where 2053s 2053s if self.coordinates is None: 2053s > self.terms = self.generate(where) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5367: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s where = [columns=df.columns[:75]] 2053s 2053s def generate(self, where): 2053s """where can be a : dict,list,tuple,string""" 2053s if where is None: 2053s return None 2053s 2053s q = self.table.queryables() 2053s try: 2053s > return PyTablesExpr(where, queryables=q, encoding=self.table.encoding) 2053s 2053s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5380: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = (columns=df.columns[:75]), where = [columns=df.columns[:75]] 2053s queryables = {'columns': None, 'index': name->index,cname->index,axis->0,pos->0,kind->string} 2053s encoding = 'UTF-8', scope_level = 0 2053s 2053s def __init__( 2053s self, 2053s where, 2053s queryables: dict[str, Any] | None = None, 2053s encoding=None, 2053s scope_level: int = 0, 2053s ) -> None: 2053s where = _validate_where(where) 2053s 2053s self.encoding = encoding 2053s self.condition = None 2053s self.filter = None 2053s self.terms = None 2053s self._visitor = None 2053s 2053s # capture the 
environment if needed 2053s local_dict: _scope.DeepChainMap[Any, Any] | None = None 2053s 2053s if isinstance(where, PyTablesExpr): 2053s local_dict = where.env.scope 2053s _where = where.expr 2053s 2053s elif is_list_like(where): 2053s where = list(where) 2053s for idx, w in enumerate(where): 2053s if isinstance(w, PyTablesExpr): 2053s local_dict = w.env.scope 2053s else: 2053s where[idx] = _validate_where(w) 2053s _where = " & ".join([f"({w})" for w in com.flatten(where)]) 2053s else: 2053s # _validate_where ensures we otherwise have a string 2053s _where = where 2053s 2053s self.expr = _where 2053s self.env = PyTablesScope(scope_level + 1, local_dict=local_dict) 2053s 2053s if queryables is not None and isinstance(self.expr, str): 2053s self.env.queryables.update(queryables) 2053s self._visitor = PyTablesExprVisitor( 2053s self.env, 2053s queryables=queryables, 2053s parser="pytables", 2053s engine="pytables", 2053s encoding=encoding, 2053s ) 2053s > self.terms = self.parse() 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:610: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = (columns=df.columns[:75]) 2053s 2053s def parse(self): 2053s """ 2053s Parse an expression. 2053s """ 2053s > return self._visitor.visit(self.expr) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:824: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s clean = '(columns ==df .columns [:75 ])', method = 'visit_Module' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s expr = 2053s 2053s def visit_Module(self, node, **kwargs): 2053s if len(node.body) != 1: 2053s raise SyntaxError("only a single expression is allowed") 2053s expr = node.body[0] 2053s > return self.visit(expr, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:417: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {}, method = 'visit_Expr' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s 2053s def visit_Expr(self, node, **kwargs): 2053s > return 
self.visit(node.value, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:420: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s method = 'visit_Compare' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s ops = [] 2053s comps = [] 2053s op = 2053s binop = 2053s 2053s def visit_Compare(self, node, **kwargs): 2053s ops = node.ops 2053s comps = node.comparators 2053s 2053s # base case: we have something like a CMP b 2053s if len(comps) == 1: 2053s op = self.translate_In(ops[0]) 2053s binop = ast.BinOp(op=op, left=node.left, right=comps[0]) 2053s > return self.visit(binop) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:715: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {}, method = 'visit_BinOp' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s 2053s def visit_BinOp(self, node, **kwargs): 2053s > op, op_class, left, right = self._maybe_transform_eq_ne(node) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:531: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , left = columns, right = None 2053s 2053s def _maybe_transform_eq_ne(self, node, left=None, right=None): 2053s if left is None: 2053s left = self.visit(node.left, side="left") 2053s if right is None: 2053s > right = self.visit(node.right, side="right") 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:453: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {'side': 'right'} 2053s method = 'visit_Subscript' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s 
visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {'side': 'right'} 2053s 2053s def visit_Subscript(self, node, **kwargs) -> ops.Term: 2053s # only allow simple subscripts 2053s 2053s > value = self.visit(node.value) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:456: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {} 2053s method = 'visit_Attribute' 2053s visitor = > 2053s 2053s def visit(self, node, **kwargs): 2053s if isinstance(node, str): 2053s clean = self.preparser(node) 2053s try: 2053s node = ast.fix_missing_locations(ast.parse(clean)) 2053s except SyntaxError as e: 2053s if any(iskeyword(x) for x in clean.split()): 2053s e.msg = "Python keyword not valid identifier in numexpr query" 2053s raise e 2053s 2053s method = f"visit_{type(node).__name__}" 2053s visitor = getattr(self, method) 2053s > return visitor(node, **kwargs) 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2053s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2053s 2053s self = 2053s node = , kwargs = {}, attr = 'columns' 2053s value = , ctx = 2053s resolved = 'df' 2053s 2053s def visit_Attribute(self, node, **kwargs): 2053s attr = node.attr 2053s value = node.value 2053s 2053s ctx = type(node.ctx) 2053s if ctx == ast.Load: 2053s # resolve the value 2053s resolved = self.visit(value) 2053s 2053s # try to get the value to see if we are another expression 2053s try: 2053s resolved = resolved.value 2053s except AttributeError: 2053s pass 2053s 2053s try: 2053s return self.term_type(getattr(resolved, attr), self.env) 2053s except AttributeError: 2053s # something like datetime.datetime where scope is overridden 2053s if isinstance(value, ast.Name) and value.id == attr: 2053s return resolved 2053s 2053s > raise ValueError(f"Invalid Attribute context {ctx.__name__}") 2053s E ValueError: Invalid Attribute context Load 2053s 2053s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:496: ValueError 2053s =============================== warnings summary =============================== 2053s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:39 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:39: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2053s pytestmark = pytest.mark.single_cpu 2053s 2053s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py:30 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py:30: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2053s pytestmark = pytest.mark.single_cpu 2053s 2053s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py:39 2053s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py:39: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2053s pytestmark = pytest.mark.single_cpu 2053s 2053s test_file_handling.py: 191 warnings 2053s test_append.py: 21 warnings 2053s test_store.py: 66 warnings 2053s /usr/lib/python3/dist-packages/py/_process/forkedfunc.py:45: DeprecationWarning: This process (pid=19561) is multi-threaded, use of fork() may lead to deadlocks in the child. 2053s pid = os.fork() 2053s 2053s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 2053s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/io/pytables/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/io/pytables/pytest-cache-files-7gdvhdoj' 2053s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 2053s 2053s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429 2053s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/io/pytables/.pytest_cache/v/cache/lastfailed: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/io/pytables/pytest-cache-files-_ndknrub' 2053s config.cache.set("cache/lastfailed", self.lastfailed) 2053s 2053s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 2053s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/io/pytables/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/io/pytables/pytest-cache-files-8at7x_n_' 2053s session.config.cache.set(STEPWISE_CACHE_DIR, []) 2053s 2053s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 2053s =========================== short test summary info ============================ 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-1] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-2] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-3] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-4] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-5] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-6] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-7] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-8] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-9] 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py::test_append_frame_column_oriented 2053s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py::test_select_filter_corner 2053s ================ 11 failed, 267 passed, 284 warnings in 14.98s ================= 2053s pymysql/psycopg2 tests, which do not work in this test environment 2056s 
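The nine test_complibs[blosc2-*] failures in the session above share one cause, visible in the assertion message: PyTables reports the compression filter for Blosc2 as 'blosc2:blosclz' (the library name plus the internal sub-codec it selected), while the test compares node.filters.complib literally against 'blosc2'. The snippet below is an illustrative sketch only, not the pandas test or its fix; the helper name is made up for the example. It shows one way such a comparison could tolerate a ':<sub-codec>' suffix.

    def complib_matches(reported: str | None, requested: str) -> bool:
        """Return True if the complib string reported by PyTables refers to the
        requested library, ignoring any ':<sub-codec>' suffix such as ':blosclz'."""
        if reported is None:
            return False
        return reported.split(":", 1)[0] == requested

    # Mirrors the values seen in the assertion errors above.
    assert complib_matches("blosc2:blosclz", "blosc2")
    assert complib_matches("zlib", "zlib")
    assert not complib_matches(None, "blosc2")

The remaining two failures in this session, test_append_frame_column_oriented and test_select_filter_corner, come from the HDFStore where-clause parser rejecting subscript expressions such as df.index[0:4] and df.columns[:75] with a ValueError; the xfail markers on both tests explicitly list raises=ValueError and reference Debian bugs 1055801 and 790925 for this behaviour.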
============================= test session starts ============================== 2056s platform linux -- Python 3.13.0, pytest-8.3.3, pluggy-1.5.0 2056s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 2056s rootdir: /usr/lib/python3/dist-packages/pandas/tests 2056s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 2056s asyncio: mode=Mode.STRICT 2056s collected 2513 items 2056s 2386s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py FFFEFE....ssFFFEFE....ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE...FFFEFE....ssFFFEFE...FFFEFE...FFFEFE...FFFEFE...xssFFFEFEFEFEFEFExxFEFExxxxssFFFEFE...FFFEFE....ssFFFEFE....ssxxxxxx....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE...xssFFFEFE...xssFFFEFE...xssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE...xssFFFEFE...xssFFFEFE...xssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssxxFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE...xssFFFEFE...xssFFFEFE...xss.FFFEFE...xssFFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FF.EFE....ssFF.EFE....ssFFFEFE...FFFEFE...FFFEFE...xss.ss...FFFEFE..sFFFEFE..sFFFEFE..s.ssFFFEFE....ssFFFEFE...FFFEFE...FFFEFE...xxFEFExxxFFFEFE...FFFEFExxxFEFEFEFEFEFEFEFEFFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE..xFFFEFE..sFFFEFE..sFFFEFE..sFFFEFE...FFFEFE...FFFEFE...FFFEFE...FFFEFE..sFFFEFE...xssFFFEFE....ssFFFEFE...FFFEFE....ss..FFFEFE....ssFFFEFE....ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssFFFEFExxxxssFFFEFExxxxssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ssFFFEFE....ss....s..FEFE................ 2408s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_datetime.py ............................................................................ss....................................................................................................................................................ssssssss......................................................................................................................................................................................................................................................................................................................................................xx....ss.ssssss....................................................s........................................................................................................................................................................................................................................................................................................ssssssssss............................ 
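The repeating F/E pattern in the test_sql.py progress line above (a failed test immediately followed by an erroring fixture teardown) corresponds to the postgresql_psycopg2_* parametrisations whose tracebacks follow: psycopg2 cannot reach a PostgreSQL server on localhost:5432 ("Connection refused"), so both the test body and the teardown's get_all_views() cleanup fail, matching the earlier note that the pymysql/psycopg2 tests do not work in this environment. A hypothetical pre-flight probe (not part of the pandas suite) that demonstrates the condition:

    import socket

    def can_connect(host="localhost", port=5432, timeout=1.0):
        """Return True if a TCP listener accepts connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # On this testbed nothing listens on 5432, so this prints False and every
    # attempt by sqlalchemy/psycopg2 to open a connection ends in the
    # "Connection refused" OperationalError shown in the teardown errors below.
    print(can_connect())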
2408s 2408s ==================================== ERRORS ==================================== 2408s ____ ERROR at teardown of test_dataframe_to_sql[postgresql_psycopg2_engine] ____ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _____ ERROR at teardown of test_dataframe_to_sql[postgresql_psycopg2_conn] _____ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
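[editor's note] The teardown failure above bottoms out in psycopg2.connect() being refused at localhost:5432. A minimal sketch, outside the test suite and using only the connection parameters visible in the dsn/cparams above, of the equivalent raw psycopg2 call:

    import psycopg2

    # Parameters copied from the dsn/cparams values in the traceback above;
    # an illustrative reproduction, not part of the pandas test suite.
    try:
        conn = psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
        conn.close()
    except psycopg2.OperationalError as exc:
        # With no PostgreSQL server listening on localhost:5432 this raises
        # 'connection to server at "localhost" ... Connection refused',
        # matching the error reported by the test teardown.
        print(exc)
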
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_dataframe_to_sql_empty[postgresql_psycopg2_engine] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s __ ERROR at teardown of test_dataframe_to_sql_empty[postgresql_psycopg2_conn] __ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s ______ ERROR at teardown of test_to_sql[None-postgresql_psycopg2_engine] _______ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _______ ERROR at teardown of test_to_sql[None-postgresql_psycopg2_conn] ________ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s ______ ERROR at teardown of test_to_sql[multi-postgresql_psycopg2_engine] ______ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _______ ERROR at teardown of test_to_sql[multi-postgresql_psycopg2_conn] _______ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
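[editor's note] The psycopg2.connect docstring quoted just above, together with the DSN shown in the traceback frames, makes the failure easy to reproduce outside pytest. A hedged sketch follows (assumes only that psycopg2 is installed; the credentials are the throwaway ones the pandas test fixture uses, not real secrets):

    # Hedged sketch, not part of the test suite: attempt the same connection
    # the failing teardown makes, using the DSN visible in the traceback.
    import psycopg2

    dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    try:
        conn = psycopg2.connect(dsn)
        conn.close()
        print("PostgreSQL is reachable")
    except psycopg2.OperationalError as err:
        # With nothing listening on localhost:5432 this prints the same
        # "Connection refused" message reported in the log above.
        print(err)

If this prints the "Connection refused" error, the test failures below are environmental (no PostgreSQL service on the testbed) rather than a pandas regression.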
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_to_sql_exist[replace-1-postgresql_psycopg2_engine] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s __ ERROR at teardown of test_to_sql_exist[replace-1-postgresql_psycopg2_conn] __ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_to_sql_exist[append-2-postgresql_psycopg2_engine] __ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s __ ERROR at teardown of test_to_sql_exist[append-2-postgresql_psycopg2_conn] ___ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s ___ ERROR at teardown of test_to_sql_exist_fail[postgresql_psycopg2_engine] ____ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s ____ ERROR at teardown of test_to_sql_exist_fail[postgresql_psycopg2_conn] _____ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
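The _handle_dbapi_exception_noconnection() code above wraps the driver's error in SQLAlchemy's own OperationalError and re-raises it "from e", which is why the log shows both psycopg2.OperationalError and sqlalchemy.exc.OperationalError (with the sqlalche.me/e/20/e3q8 reference). A short sketch, assuming the same engine URL, showing that the wrapped exception keeps the original DBAPI error on its .orig attribute:

# Sketch: the SQLAlchemy exception wraps, rather than replaces, the psycopg2 error.
import sqlalchemy
from sqlalchemy.exc import OperationalError

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)

try:
    engine.connect().close()
except OperationalError as exc:
    print(type(exc))       # sqlalchemy.exc.OperationalError (the wrapper)
    print(type(exc.orig))  # psycopg2.OperationalError (the original DBAPI error)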
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s __ ERROR at teardown of test_read_iris_query[postgresql_psycopg2_engine_iris] __ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s ___ ERROR at teardown of test_read_iris_query[postgresql_psycopg2_conn_iris] ___ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
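As _handle_dbapi_exception_noconnection above shows, the driver-level psycopg2.OperationalError is wrapped in a sqlalchemy.exc.OperationalError and re-raised "from" the original, which is why both exception types appear in each chained traceback here. A short sketch of unwrapping the SQLAlchemy exception back to the DBAPI error (illustrative; the URL is the one used by these tests):

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        engine.connect()
    except OperationalError as exc:
        # exc.orig is the underlying DBAPI exception (psycopg2.OperationalError here);
        # exc.__cause__ is the same object because of the "raise ... from e" seen above.
        print(type(exc.orig), exc.orig)
        print(exc.__cause__ is exc.orig)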
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_query_chunksize[postgresql_psycopg2_engine_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_query_chunksize[postgresql_psycopg2_conn_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_query_expression_with_parameter[postgresql_psycopg2_engine_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_query_expression_with_parameter[postgresql_psycopg2_conn_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_query_string_with_parameter[postgresql_psycopg2_engine_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
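Engine.raw_connection(), whose docstring is repeated in these frames, hands back a pool-proxied DBAPI connection whose close() merely returns it to the pool rather than closing it. A short sketch of that accessor, again using in-memory SQLite so it runs without a server (assumes SQLAlchemy 2.x):

from sqlalchemy import create_engine

engine = create_engine("sqlite:///:memory:")

raw = engine.raw_connection()   # proxied sqlite3 connection checked out of the pool
cur = raw.cursor()              # plain DBAPI cursor, no SQLAlchemy layer involved
cur.execute("SELECT 42")
print(cur.fetchone())           # (42,)
cur.close()
raw.close()                     # returns the connection to the pool, does not close it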
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_query_string_with_parameter[postgresql_psycopg2_conn_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s __ ERROR at teardown of test_read_iris_table[postgresql_psycopg2_engine_iris] __ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s ___ ERROR at teardown of test_read_iris_table[postgresql_psycopg2_conn_iris] ___ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_table_chunksize[postgresql_psycopg2_engine_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_read_iris_table_chunksize[postgresql_psycopg2_conn_iris] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s ____ ERROR at teardown of test_to_sql_callable[postgresql_psycopg2_engine] _____ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _____ ERROR at teardown of test_to_sql_callable[postgresql_psycopg2_conn] ______ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
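For context on the failure above: the psycopg2.connect docstring quoted in this traceback lists the parameters the test suite passes, and the captured cparams/dsn show the concrete values (dbname=pandas, user=postgres, password=postgres, host=localhost, port=5432). A minimal illustrative sketch, not part of the test suite or of this log, that attempts the same connection and reproduces the "Connection refused" condition when no PostgreSQL server is listening:

    import psycopg2

    try:
        # Same parameters as the failing tests; adjust if your server differs.
        conn = psycopg2.connect(
            dbname="pandas",
            user="postgres",
            password="postgres",
            host="localhost",
            port=5432,
        )
    except psycopg2.OperationalError as exc:
        # This is the condition the autopkgtest environment hits: nothing listens on 5432.
        print(f"PostgreSQL is not reachable: {exc}")
    else:
        conn.close()
        print("PostgreSQL is reachable")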
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_default_type_conversion[postgresql_psycopg2_engine_types] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
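The postgresql_psycopg2_engine fixture and get_all_views shown above follow a simple pattern: build an Engine with NullPool, then use SQLAlchemy's inspect() to list view names, which has to open a real connection. A rough, self-contained sketch of that pattern, assuming the same URL as the fixture (illustrative only, not the suite's exact code):

    import sqlalchemy
    from sqlalchemy import inspect
    from sqlalchemy.exc import OperationalError
    from sqlalchemy.pool import NullPool

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=NullPool,  # as in the fixture: no pooled connections are kept open
    )
    try:
        # inspect(engine) immediately opens (and closes) a connection, which is the
        # step that fails here with "Connection refused" at fixture teardown.
        print(inspect(engine).get_view_names())
    except OperationalError as exc:
        print(f"cannot inspect database: {exc}")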
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
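As the frames above show, _handle_dbapi_exception_noconnection wraps the driver's psycopg2.OperationalError into sqlalchemy.exc.OperationalError and re-raises it from the original, so the DBAPI error stays attached to the wrapper. An illustrative sketch (same URL assumed as above) of how calling code can reach both exceptions:

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        engine.connect().close()
    except OperationalError as exc:
        # exc is the SQLAlchemy wrapper; the original psycopg2.OperationalError is kept
        # on exc.orig (and, because of "raise ... from e", on exc.__cause__).
        print(type(exc.orig).__name__, exc.orig)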
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_default_type_conversion[postgresql_psycopg2_conn_types] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_copy_from_callable_insertion_method[2-postgresql_psycopg2_engine] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_copy_from_callable_insertion_method[2-postgresql_psycopg2_conn] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
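The keyword parameters documented above are the same ones SQLAlchemy passes through as cparams in this traceback. As an illustrative, standalone sketch of that call (values copied from the dsn shown in the log; this snippet is not part of the pandas test suite), the failing connection attempt reduces to:

    import psycopg2

    # Keyword values mirror the dsn/cparams above:
    # host=localhost dbname=pandas user=postgres password=postgres port=5432
    try:
        conn = psycopg2.connect(
            dbname="pandas",
            user="postgres",
            password="postgres",
            host="localhost",
            port=5432,
        )
        conn.close()
    except psycopg2.OperationalError as err:
        # With nothing listening on port 5432 this raises the same
        # "Connection refused" error that SQLAlchemy wraps further down.
        print(err)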
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_engine] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
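The postgresql_psycopg2_engine fixture above expects a PostgreSQL server on localhost:5432 with user postgres, password postgres and a database named pandas; on this testbed no such server is running, so the teardown's get_all_views(engine) call fails as soon as the pool tries to create a connection. A minimal connectivity check built from the same engine parameters (illustrative only, reusing the URL and NullPool setting from the fixture) would be:

    import sqlalchemy

    # Same URL and pool class as the pandas fixture shown above.
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )
    try:
        with engine.connect() as conn:
            conn.execute(sqlalchemy.text("SELECT 1"))
        print("PostgreSQL reachable")
    except sqlalchemy.exc.OperationalError as err:
        # This is the wrapped error reported for each teardown in this log.
        print("PostgreSQL not reachable:", err)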
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
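_handle_dbapi_exception_noconnection above is where the raw psycopg2.OperationalError is wrapped into sqlalchemy.exc.OperationalError, which is why the reported errors carry the "(psycopg2.OperationalError)" prefix and the https://sqlalche.me/e/20/e3q8 link. The wrapper keeps the original driver exception, so it can still be inspected; a small illustrative helper (not from the test suite), assuming a caught wrapped error:

    from sqlalchemy import exc

    def describe_dbapi_error(err: exc.DBAPIError) -> str:
        # err.orig is the original driver exception (here psycopg2.OperationalError);
        # connection_invalidated reflects the is_disconnect decision made above.
        return f"{type(err.orig).__name__}: {err.orig} (invalidated={err.connection_invalidated})"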
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_conn] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
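[Note] The psycopg2.connect docstring quoted in the frame above can be exercised directly with the same parameters that appear in cparams/dsn. A minimal sketch, assuming psycopg2 is installed; the connect_timeout argument is only illustrative (it is a standard libpq parameter, not part of the pandas test):

    import psycopg2

    try:
        # Same parameters as the dsn shown in the traceback above.
        conn = psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
            connect_timeout=3,  # illustrative: fail fast instead of hanging
        )
    except psycopg2.OperationalError as exc:
        # With nothing listening on port 5432 this prints the same
        # "Connection refused" message recorded in this log.
        print(exc)
    else:
        conn.close()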
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_engine] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
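[Note] The postgresql_psycopg2_engine fixture shown above yields the engine and then tries to drop leftover views/tables, which itself needs a live server; that is why every test ends in a teardown ERROR here. A minimal sketch of the same pattern with a guarded teardown, assuming pytest and SQLAlchemy; the fixture name pg_engine and the plain "SELECT 1" cleanup stand-in are hypothetical, not the pandas code:

    import pytest
    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    @pytest.fixture
    def pg_engine():
        # Mirrors the fixture above: NullPool means every checkout opens
        # (and closes) a fresh psycopg2 connection.
        engine = sqlalchemy.create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
            poolclass=sqlalchemy.pool.NullPool,
        )
        yield engine
        try:
            # The real cleanup also needs a reachable server; guarding it
            # keeps an unreachable server from turning every test into a
            # teardown ERROR like the ones in this log.
            with engine.connect() as conn:
                conn.execute(sqlalchemy.text("SELECT 1"))
        except OperationalError:
            pass
        finally:
            engine.dispose()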
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
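[Note] The _handle_dbapi_exception_noconnection frame above is where SQLAlchemy wraps the raw psycopg2.OperationalError into sqlalchemy.exc.OperationalError and re-raises it "from e". A minimal sketch of catching the wrapped error, assuming SQLAlchemy 2.x and psycopg2 as in this testbed:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except OperationalError as exc:
        # The original driver error is kept on .orig (and as __cause__);
        # .code is the short code behind the sqlalche.me link ("e3q8").
        print(type(exc.orig).__name__)
        print(exc.code)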
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E 2408s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s _ ERROR at teardown of test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_conn] _ 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s > self._dbapi_connection = engine.raw_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def raw_connection(self) -> PoolProxiedConnection: 2408s """Return a "raw" DBAPI connection from the connection pool. 2408s 2408s The returned object is a proxied version of the DBAPI 2408s connection object used by the underlying driver in use. 2408s The object will have all the same behavior as the real DBAPI 2408s connection, except that its ``close()`` method will result in the 2408s connection being returned to the pool, rather than being closed 2408s for real. 2408s 2408s This method provides direct DBAPI connection access for 2408s special situations when the API provided by 2408s :class:`_engine.Connection` 2408s is not needed. When a :class:`_engine.Connection` object is already 2408s present, the DBAPI connection is available using 2408s the :attr:`_engine.Connection.connection` accessor. 2408s 2408s .. seealso:: 2408s 2408s :ref:`dbapi_connections` 2408s 2408s """ 2408s > return self.pool.connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def connect(self) -> PoolProxiedConnection: 2408s """Return a DBAPI connection from the pool. 2408s 2408s The connection is instrumented such that when its 2408s ``close()`` method is called, the connection will be returned to 2408s the pool. 
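[Note] "Connection refused" on both ::1 and 127.0.0.1 port 5432, as in the error above, simply means no PostgreSQL server is listening in the testbed. A stdlib-only sketch for probing the port before running the database-backed tests; the helper name postgres_reachable is hypothetical:

    import socket

    def postgres_reachable(host="localhost", port=5432, timeout=1.0):
        # Refused connections raise ConnectionRefusedError, a subclass of OSError.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(postgres_reachable())  # False on this testbed: postgresql is not running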
2408s 2408s """ 2408s > return _ConnectionFairy._checkout(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s threadconns = None, fairy = None 2408s 2408s @classmethod 2408s def _checkout( 2408s cls, 2408s pool: Pool, 2408s threadconns: Optional[threading.local] = None, 2408s fairy: Optional[_ConnectionFairy] = None, 2408s ) -> _ConnectionFairy: 2408s if not fairy: 2408s > fairy = _ConnectionRecord.checkout(pool) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s pool = 2408s 2408s @classmethod 2408s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2408s if TYPE_CHECKING: 2408s rec = cast(_ConnectionRecord, pool._do_get()) 2408s else: 2408s > rec = pool._do_get() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _do_get(self) -> ConnectionPoolEntry: 2408s > return self._create_connection() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def _create_connection(self) -> ConnectionPoolEntry: 2408s """Called by subclasses to create a new ConnectionRecord.""" 2408s 2408s > return _ConnectionRecord(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s pool = , connect = True 2408s 2408s def __init__(self, pool: Pool, connect: bool = True): 2408s self.fresh = False 2408s self.fairy_ref = None 2408s self.starttime = 0 2408s self.dbapi_connection = None 2408s 2408s self.__pool = pool 2408s if connect: 2408s > self.__connect() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s self.dbapi_connection = connection = pool._invoke_creator(self) 2408s pool.logger.debug("Created new connection %r", connection) 2408s self.fresh = True 2408s except BaseException as e: 2408s > with util.safe_reraise(): 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s type_ = None, value = None, traceback = None 2408s 2408s def __exit__( 2408s self, 2408s type_: Optional[Type[BaseException]], 2408s value: Optional[BaseException], 2408s traceback: Optional[types.TracebackType], 2408s ) -> NoReturn: 2408s assert self._exc_info is not None 2408s # see #2703 for notes 2408s if type_ is None: 2408s exc_type, exc_value, exc_tb = self._exc_info 2408s assert exc_value is not None 2408s self._exc_info = None # remove potential circular references 2408s > raise exc_value.with_traceback(exc_tb) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s 2408s def __connect(self) -> None: 2408s pool = self.__pool 2408s 2408s # ensure any existing connection is removed, so that if 2408s # creator fails, this attribute stays None 2408s self.dbapi_connection = None 2408s try: 2408s self.starttime = time.time() 2408s > self.dbapi_connection = connection = pool._invoke_creator(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s connection_record = 2408s 2408s def connect( 2408s connection_record: Optional[ConnectionPoolEntry] = None, 2408s ) -> DBAPIConnection: 2408s if dialect._has_events: 2408s for fn in dialect.dispatch.do_connect: 2408s connection = cast( 2408s DBAPIConnection, 2408s fn(dialect, connection_record, cargs, cparams), 2408s ) 2408s if connection is not None: 2408s return connection 2408s 2408s > return dialect.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s cargs = () 2408s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s 2408s def connect(self, *cargs, **cparams): 2408s # inherits the docstring from interfaces.Dialect.connect 2408s > return self.loaded_dbapi.connect(*cargs, **cparams) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2408s connection_factory = None, cursor_factory = None 2408s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2408s kwasync = {} 2408s 2408s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2408s """ 2408s Create a new database connection. 2408s 2408s The connection parameters can be specified as a string: 2408s 2408s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2408s 2408s or using a set of keyword arguments: 2408s 2408s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2408s 2408s Or as a mix of both. The basic connection parameters are: 2408s 2408s - *dbname*: the database name 2408s - *database*: the database name (only as keyword argument) 2408s - *user*: user name used to authenticate 2408s - *password*: password used to authenticate 2408s - *host*: database host address (defaults to UNIX socket if not provided) 2408s - *port*: connection port number (defaults to 5432 if not provided) 2408s 2408s Using the *connection_factory* parameter a different class or connections 2408s factory can be specified. It should be a callable object taking a dsn 2408s argument. 2408s 2408s Using the *cursor_factory* parameter, a new default cursor factory will be 2408s used by cursor(). 2408s 2408s Using *async*=True an asynchronous connection will be created. *async_* is 2408s a valid alias (for Python versions where ``async`` is a keyword). 2408s 2408s Any other keyword parameter will be passed to the underlying client 2408s library: the list of supported parameters depends on the library version. 
2408s 2408s """ 2408s kwasync = {} 2408s if 'async' in kwargs: 2408s kwasync['async'] = kwargs.pop('async') 2408s if 'async_' in kwargs: 2408s kwasync['async_'] = kwargs.pop('async_') 2408s 2408s dsn = _ext.make_dsn(dsn, **kwargs) 2408s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2408s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2408s E Is the server running on that host and accepting TCP/IP connections? 2408s 2408s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2408s 2408s The above exception was the direct cause of the following exception: 2408s 2408s @pytest.fixture 2408s def postgresql_psycopg2_engine(): 2408s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2408s td.versioned_importorskip("psycopg2") 2408s engine = sqlalchemy.create_engine( 2408s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2408s poolclass=sqlalchemy.pool.NullPool, 2408s ) 2408s yield engine 2408s > for view in get_all_views(engine): 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def get_all_views(conn): 2408s if isinstance(conn, sqlite3.Connection): 2408s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2408s return [view[0] for view in c.fetchall()] 2408s else: 2408s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2408s if adbc and isinstance(conn, adbc.Connection): 2408s results = [] 2408s info = conn.adbc_get_objects().read_all().to_pylist() 2408s for catalog in info: 2408s catalog["catalog_name"] 2408s for schema in catalog["catalog_db_schemas"]: 2408s schema["db_schema_name"] 2408s for table in schema["db_schema_tables"]: 2408s if table["table_type"] == "view": 2408s view_name = table["table_name"] 2408s results.append(view_name) 2408s 2408s return results 2408s else: 2408s from sqlalchemy import inspect 2408s 2408s > return inspect(conn).get_view_names() 2408s 2408s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s raiseerr = True 2408s 2408s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2408s """Produce an inspection object for the given target. 2408s 2408s The returned value in some cases may be the 2408s same object as the one given, such as if a 2408s :class:`_orm.Mapper` object is passed. In other 2408s cases, it will be an instance of the registered 2408s inspection type for the given object, such as 2408s if an :class:`_engine.Engine` is passed, an 2408s :class:`_reflection.Inspector` object is returned. 2408s 2408s :param subject: the subject to be inspected. 2408s :param raiseerr: When ``True``, if the given subject 2408s does not 2408s correspond to a known SQLAlchemy inspected type, 2408s :class:`sqlalchemy.exc.NoInspectionAvailable` 2408s is raised. If ``False``, ``None`` is returned. 
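[Note] get_all_views, quoted above, dispatches on the connection type: sqlite3 connections query sqlite_master directly, while SQLAlchemy engines go through inspect(engine).get_view_names(), and it is that inspection step which fails against the missing server. A minimal sketch of both branches using in-memory SQLite so it runs without a server:

    import sqlite3
    import sqlalchemy

    # sqlite3 branch: query sqlite_master directly.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("CREATE VIEW v AS SELECT x FROM t")
    cur = conn.execute("SELECT name FROM sqlite_master WHERE type='view'")
    print([row[0] for row in cur.fetchall()])  # ['v']
    conn.close()

    # SQLAlchemy branch: inspect(engine) connects first, which is the
    # step that raises OperationalError against the PostgreSQL URL above.
    engine = sqlalchemy.create_engine("sqlite:///:memory:")
    print(sqlalchemy.inspect(engine).get_view_names())  # []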
2408s 2408s """ 2408s type_ = type(subject) 2408s for cls in type_.__mro__: 2408s if cls in _registrars: 2408s reg = _registrars.get(cls, None) 2408s if reg is None: 2408s continue 2408s elif reg is True: 2408s return subject 2408s > ret = reg(subject) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @inspection._inspects(Engine) 2408s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2408s > return Inspector._construct(Inspector._init_engine, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s init = 2408s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s @classmethod 2408s def _construct( 2408s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2408s ) -> Inspector: 2408s if hasattr(bind.dialect, "inspector"): 2408s cls = bind.dialect.inspector 2408s 2408s self = cls.__new__(cls) 2408s > init(self, bind) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def _init_engine(self, engine: Engine) -> None: 2408s self.bind = self.engine = engine 2408s > engine.connect().close() 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s 2408s def connect(self) -> Connection: 2408s """Return a new :class:`_engine.Connection` object. 2408s 2408s The :class:`_engine.Connection` acts as a Python context manager, so 2408s the typical use of this method looks like:: 2408s 2408s with engine.connect() as connection: 2408s connection.execute(text("insert into table values ('foo')")) 2408s connection.commit() 2408s 2408s Where above, after the block is completed, the connection is "closed" 2408s and its underlying DBAPI resources are returned to the connection pool. 2408s This also has the effect of rolling back any transaction that 2408s was explicitly begun or was begun via autobegin, and will 2408s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2408s started and is still in progress. 2408s 2408s .. 
seealso:: 2408s 2408s :meth:`_engine.Engine.begin` 2408s 2408s """ 2408s 2408s > return self._connection_cls(self) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2408s _has_events: Optional[bool] = None, 2408s _allow_revalidate: bool = True, 2408s _allow_autobegin: bool = True, 2408s ): 2408s """Construct a new Connection.""" 2408s self.engine = engine 2408s self.dialect = dialect = engine.dialect 2408s 2408s if connection is None: 2408s try: 2408s self._dbapi_connection = engine.raw_connection() 2408s except dialect.loaded_dbapi.Error as err: 2408s > Connection._handle_dbapi_exception_noconnection( 2408s err, dialect, engine 2408s ) 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s cls = 2408s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2408s dialect = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2408s 2408s @classmethod 2408s def _handle_dbapi_exception_noconnection( 2408s cls, 2408s e: BaseException, 2408s dialect: Dialect, 2408s engine: Optional[Engine] = None, 2408s is_disconnect: Optional[bool] = None, 2408s invalidate_pool_on_disconnect: bool = True, 2408s is_pre_ping: bool = False, 2408s ) -> NoReturn: 2408s exc_info = sys.exc_info() 2408s 2408s if is_disconnect is None: 2408s is_disconnect = isinstance( 2408s e, dialect.loaded_dbapi.Error 2408s ) and dialect.is_disconnect(e, None, None) 2408s 2408s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2408s 2408s if should_wrap: 2408s sqlalchemy_exception = exc.DBAPIError.instance( 2408s None, 2408s None, 2408s cast(Exception, e), 2408s dialect.loaded_dbapi.Error, 2408s hide_parameters=( 2408s engine.hide_parameters if engine is not None else False 2408s ), 2408s connection_invalidated=is_disconnect, 2408s dialect=dialect, 2408s ) 2408s else: 2408s sqlalchemy_exception = None 2408s 2408s newraise = None 2408s 2408s if dialect._has_events: 2408s ctx = ExceptionContextImpl( 2408s e, 2408s sqlalchemy_exception, 2408s engine, 2408s dialect, 2408s None, 2408s None, 2408s None, 2408s None, 2408s None, 2408s is_disconnect, 2408s invalidate_pool_on_disconnect, 2408s is_pre_ping, 2408s ) 2408s for fn in dialect.dispatch.handle_error: 2408s try: 2408s # handler returns an exception; 2408s # call next handler in a chain 2408s per_fn = fn(ctx) 2408s if per_fn is not None: 2408s ctx.chained_exception = newraise = per_fn 2408s except Exception as _raised: 2408s # handler raises an exception - stop processing 2408s newraise = _raised 2408s break 2408s 2408s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2408s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2408s ctx.is_disconnect 2408s ) 2408s 2408s if newraise: 2408s raise 
newraise.with_traceback(exc_info[2]) from e 2408s elif should_wrap: 2408s assert sqlalchemy_exception is not None 2408s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2408s 2408s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2408s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2408s 2408s self = 2408s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2408s connection = None, _has_events = None, _allow_revalidate = True 2408s _allow_autobegin = True 2408s 2408s def __init__( 2408s self, 2408s engine: Engine, 2408s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
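[Note] The Engine.connect docstring quoted above describes the context-manager usage that the Inspector relies on (engine.connect().close()). A minimal runnable sketch of that usage, with in-memory SQLite standing in for the unreachable PostgreSQL server:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite:///:memory:")
    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()
        print(connection.execute(text("SELECT x FROM t")).scalar())  # 1
    # Leaving the block returns the DBAPI connection to the pool and rolls
    # back any transaction still in progress, as the docstring describes.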
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_to_sql_on_public_schema[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
2409s >               raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 
2409s 
2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)
2409s >               self._dbapi_connection = engine.raw_connection()
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 
2409s >           return self.pool.connect()
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 
2409s >           return _ConnectionFairy._checkout(self)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 
2409s >           fairy = _ConnectionRecord.checkout(pool)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 
2409s >           rec = pool._do_get()
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 
2409s >           return self._create_connection()
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 
2409s >           return _ConnectionRecord(self)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 
2409s >           self.__connect()
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 
2409s >           with util.safe_reraise():
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 
2409s >           raise exc_value.with_traceback(exc_tb)
2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 
2409s >           self.dbapi_connection = connection = pool._invoke_creator(self)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 
2409s >           return dialect.connect(*cargs, **cparams)
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 
2409s 
2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...}
2409s >           return self.loaded_dbapi.connect(*cargs, **cparams)
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 
2409s 
2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432'
2409s 
2409s     def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs):
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_to_sql_on_public_schema[postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
2409s >               raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 
2409s 
2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)
2409s >               self._dbapi_connection = engine.raw_connection()
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 
2409s >           return self.pool.connect()
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 
2409s >           return _ConnectionFairy._checkout(self)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 
2409s >           fairy = _ConnectionRecord.checkout(pool)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 
2409s >           rec = pool._do_get()
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 
2409s >           return self._create_connection()
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 
2409s >           return _ConnectionRecord(self)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 
2409s >           self.__connect()
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 
2409s >           with util.safe_reraise():
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 
2409s >           raise exc_value.with_traceback(exc_tb)
2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 
2409s >           self.dbapi_connection = connection = pool._invoke_creator(self)
2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 
2409s >           return dialect.connect(*cargs, **cparams)
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 
2409s 
2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...}
2409s >           return self.loaded_dbapi.connect(*cargs, **cparams)
2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 
2409s 
2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432'
2409s 
2409s     def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs):
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_read_view_postgres[postgresql_psycopg2_engine] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_read_view_postgres[postgresql_psycopg2_conn] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_read_sql_iris_parameter[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
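The frames around sqlalchemy/engine/base.py:2442 above show SQLAlchemy wrapping the driver error via exc.DBAPIError.instance() and re-raising it, which is why the same refused connection surfaces both as psycopg2.OperationalError and as sqlalchemy.exc.OperationalError in these teardown reports. A rough sketch of how a caller observes the wrapped form, assuming the same engine URL the fixture uses (illustrative, not taken from this run):

    import sqlalchemy
    from sqlalchemy import create_engine, inspect

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )
    try:
        # inspect() connects immediately, so this raises when no server is listening
        inspect(engine).get_view_names()
    except sqlalchemy.exc.OperationalError as exc:
        # exc.orig holds the underlying psycopg2.OperationalError that was wrapped
        print(type(exc.orig).__name__, exc.orig)
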
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_read_sql_iris_parameter[postgresql_psycopg2_conn_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_read_sql_iris_named_parameter[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_read_sql_iris_named_parameter[postgresql_psycopg2_conn_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_read_sql_view[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_read_sql_view[postgresql_psycopg2_conn_iris] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_conn_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
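[editor's note: the psycopg2 docstring captured above describes two equivalent calling conventions; a minimal illustrative sketch using the exact parameters SQLAlchemy passes in this traceback (dbname=pandas, host=localhost, port=5432, user/password=postgres), assuming a reachable server, which is precisely what is missing on this testbed:]

    import psycopg2

    # keyword form, mirroring the cparams dict shown in the traceback
    conn = psycopg2.connect(dbname="pandas", user="postgres", password="postgres",
                            host="localhost", port=5432)
    conn.close()

    # equivalent DSN-string form, mirroring the dsn value shown above
    conn = psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")
    conn.close()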
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _______ ERROR at teardown of test_api_to_sql[postgresql_psycopg2_engine] _______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
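[editor's note: the wrapping logic captured above is why the same refusal appears twice in this log, first as psycopg2.OperationalError and then as sqlalchemy.exc.OperationalError; a brief sketch of how calling code can reach the driver-level error again, assuming the same unreachable DSN:]

    import sqlalchemy
    from sqlalchemy import exc

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        engine.connect()
    except exc.OperationalError as err:
        # err is the SQLAlchemy wrapper raised by
        # _handle_dbapi_exception_noconnection; err.orig is the original
        # psycopg2.OperationalError chained in via "raise ... from e".
        print(type(err.orig), err.orig)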
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ________ ERROR at teardown of test_api_to_sql[postgresql_psycopg2_conn] ________ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_api_to_sql_fail[postgresql_psycopg2_engine] _____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
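The fixture quoted above builds the engine with NullPool and, after the test yields, walks get_all_views(engine); for a SQLAlchemy engine that reduces to sqlalchemy.inspect(engine).get_view_names(), which is the first call that actually has to connect. A minimal sketch of that teardown path, with an added guard for the unreachable-server case (the guard is illustrative; the real fixture has none, which is why the errors surface here):

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    # Engine construction mirrors the fixture quoted above (same URL, NullPool).
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )

    try:
        # This is the call the teardown reaches via get_all_views():
        # inspect(engine) opens one connection, then lists view names.
        views = sqlalchemy.inspect(engine).get_view_names()
        print("views left behind by the tests:", views)
    except OperationalError as exc:
        # With no server on localhost:5432 this is exactly where the log's errors originate.
        print("PostgreSQL not reachable:", exc)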
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _____ ERROR at teardown of test_api_to_sql_fail[postgresql_psycopg2_conn] ______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_api_to_sql_replace[postgresql_psycopg2_engine] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_api_to_sql_replace[postgresql_psycopg2_conn] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_api_to_sql_append[postgresql_psycopg2_engine] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
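The frames above show why the same refusal is reported twice in this log: engine.raw_connection() lets the bare psycopg2.OperationalError escape, Connection.__init__ catches it as dialect.loaded_dbapi.Error, and _handle_dbapi_exception_noconnection re-raises it wrapped as sqlalchemy.exc.OperationalError, which carries the sqlalche.me/e/20/e3q8 link. A minimal sketch of catching the wrapped form from calling code, assuming the same unreachable URL used by the fixture:

import sqlalchemy
from sqlalchemy import exc

# create_engine is lazy; no connection is attempted until connect() is called.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)

try:
    with engine.connect():
        pass
except exc.OperationalError as err:
    # err is the SQLAlchemy wrapper; the original driver exception
    # (psycopg2.OperationalError in this run) is kept on err.orig.
    print(type(err.orig).__name__, err.orig)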
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_api_to_sql_append[postgresql_psycopg2_conn] _____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_type_mapping[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_type_mapping[postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
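Note: every one of these teardown errors is the same "Connection refused" on localhost:5432, which indicates that no PostgreSQL server is listening in the testbed, so the PostgreSQL-backed pandas SQL tests cannot set up or tear down their fixtures. A minimal connectivity probe such as the sketch below (illustrative only, not part of the test suite; it reuses the DSN shown in the traceback) can confirm whether the server the fixtures assume is actually reachable:

    # Hypothetical pre-flight check: try the same DSN the pandas fixture uses and report
    # whether a PostgreSQL server is accepting TCP/IP connections on localhost:5432.
    import psycopg2

    DSN = "host=localhost dbname=pandas user=postgres password=postgres port=5432"

    def postgres_reachable(dsn: str = DSN) -> bool:
        try:
            conn = psycopg2.connect(dsn, connect_timeout=3)
        except psycopg2.OperationalError as exc:
            # This is the failure mode seen in the log when nothing listens on the port.
            print(f"not reachable: {exc}")
            return False
        conn.close()
        return True

    if __name__ == "__main__":
        print("reachable" if postgres_reachable() else "not reachable")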
2409s _ ERROR at teardown of test_api_to_sql_type_mapping[postgresql_psycopg2_conn] __
2409s (same teardown traceback as above: the error is raised in the postgresql_psycopg2_engine fixture teardown while get_all_views(engine) tries to connect)
2409s E   sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused
2409s E       Is the server running on that host and accepting TCP/IP connections?
2409s E   connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused
2409s E       Is the server running on that host and accepting TCP/IP connections?
2409s E
2409s E   (Background on this error at: https://sqlalche.me/e/20/e3q8)
2409s
2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_api_to_sql_series[postgresql_psycopg2_engine] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_api_to_sql_series[postgresql_psycopg2_conn] _____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _____ ERROR at teardown of test_api_roundtrip[postgresql_psycopg2_engine] ______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ______ ERROR at teardown of test_api_roundtrip[postgresql_psycopg2_conn] _______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_roundtrip_chunksize[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_roundtrip_chunksize[postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_execute_sql[postgresql_psycopg2_engine_iris] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
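Note: the _handle_dbapi_exception_noconnection frames above show how SQLAlchemy wraps the driver-level psycopg2.OperationalError into sqlalchemy.exc.OperationalError, chaining the original as the cause and appending the e3q8 background link. A hedged sketch of how calling code can catch the wrapped error and still reach the underlying DBAPI exception (function and URL are hypothetical, used only to illustrate the wrapping):

    # Sketch: the wrapped error raised in the frames above keeps the original
    # DBAPI exception on .orig (and as __cause__), so callers can inspect it.
    import sqlalchemy
    from sqlalchemy import exc, text

    def try_select_one(url):
        engine = sqlalchemy.create_engine(url)
        try:
            with engine.connect() as conn:
                return conn.execute(text("SELECT 1")).scalar()
        except exc.OperationalError as err:
            # err.orig is the driver exception, e.g. psycopg2.OperationalError.
            print("connect failed:", type(err.orig).__name__, err.orig)
            return None
        finally:
            engine.dispose()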
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_api_execute_sql[postgresql_psycopg2_conn_iris] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_date_parsing[postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_date_parsing[postgresql_psycopg2_conn_types] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
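At this point SQLAlchemy wraps the driver error: the psycopg2.OperationalError becomes the cause of a sqlalchemy.exc.OperationalError raised from base.py:2442. A hedged sketch of how that wrapping looks from calling code, assuming psycopg2 is installed and nothing listens on localhost:5432 (the URL and NullPool mirror the fixture above):

    import sqlalchemy
    from sqlalchemy import exc, text

    # Points at the same unreachable server the test run targets.
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )

    try:
        with engine.connect() as connection:
            connection.execute(text("SELECT 1"))
    except exc.OperationalError as err:
        # err is the wrapped sqlalchemy.exc.OperationalError; err.orig is the
        # original psycopg2.OperationalError it chains from, matching the
        # "direct cause" structure of the tracebacks in this log.
        print(type(err.orig), err.orig)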
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
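These teardown errors all reduce to the same condition: nothing is listening on localhost:5432 in the testbed. A minimal sketch (not part of this log), assuming the same postgresql+psycopg2://postgres:postgres@localhost:5432/pandas URI used by the postgresql_psycopg2_engine fixture shown above, of how that connectivity could be probed before the SQL tests run:

# Minimal sketch, assuming the pandas test credentials shown in the fixture above.
# Returns False when no PostgreSQL server accepts connections on localhost:5432,
# which is exactly the "Connection refused" case recorded in these tracebacks.
import sqlalchemy
from sqlalchemy.exc import OperationalError

TEST_URI = "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"  # assumed from the fixture

def postgres_available(uri: str = TEST_URI) -> bool:
    engine = sqlalchemy.create_engine(uri, poolclass=sqlalchemy.pool.NullPool)
    try:
        with engine.connect():
            return True
    except OperationalError:
        # psycopg2's "Connection refused" is wrapped into sqlalchemy.exc.OperationalError,
        # as seen in the errors above.
        return False
    finally:
        engine.dispose()

print(postgres_available())

On this testbed the probe would return False, matching the repeated OperationalError at every *_psycopg2_* fixture teardown.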
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
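When the DBAPI connect fails, _handle_dbapi_exception_noconnection (shown in the frames above) wraps the driver error in sqlalchemy.exc.OperationalError and re-raises it "from e", which is why the log shows both the psycopg2 error and the SQLAlchemy e3q8 wrapper. A hedged sketch of inspecting that chain, assuming the same unreachable localhost:5432 endpoint as above:

    import sqlalchemy

    try:
        sqlalchemy.create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
        ).connect()
    except sqlalchemy.exc.OperationalError as err:
        # err is the SQLAlchemy wrapper (error code e3q8); the original
        # psycopg2.OperationalError remains available as err.orig and,
        # because of the "raise ... from e", also as err.__cause__.
        print(type(err), "wraps", type(err.orig))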
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
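The frames at engine/base.py:146 and engine/base.py:3302 go through Engine.raw_connection(), which hands back a proxied DBAPI connection: calling close() on it checks the connection back into the pool rather than closing it for real. A sketch of that behaviour using an in-memory SQLite engine, so it runs without the PostgreSQL server this testbed lacks; the SQLite URL is an illustrative substitution:

    # Demonstrate the proxied close() behaviour of raw_connection() (sketch).
    import sqlalchemy

    engine = sqlalchemy.create_engine("sqlite://")   # in-memory database, no server needed
    raw = engine.raw_connection()                    # proxied DBAPI connection from the pool
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())
    cur.close()
    raw.close()                                      # returned to the pool, not actually closed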
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
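As the _handle_dbapi_exception_noconnection frame above shows, the driver-level psycopg2.OperationalError is re-raised wrapped in sqlalchemy.exc.OperationalError (the sqlalche.me/e/20/e3q8 page linked in the error text). A small hedged sketch of telling the two layers apart when handling such failures, assuming the same unreachable DSN as in the log:

    import psycopg2
    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        engine.connect()
    except sqlalchemy.exc.OperationalError as exc:
        # .orig holds the underlying DBAPI exception that the pool raised
        assert isinstance(exc.orig, psycopg2.OperationalError)
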
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
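The psycopg2.connect docstring quoted in this traceback explains that connection parameters may be given as a libpq DSN string, as keyword arguments, or as a mix of both. A minimal illustrative sketch of the two spellings follows; the host/port/dbname/user values simply mirror the DSN recorded in this log and are not a working configuration on this testbed, which is why the OperationalError branch is the one that fires here.

import psycopg2

def try_connect():
    # Attempt the same connection the test fixture makes; values mirror the log's DSN.
    try:
        # keyword-argument form
        return psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # On this testbed nothing listens on port 5432, so this is the path taken:
        # connection to server at "localhost" ... Connection refused
        print(f"connection failed: {exc}")
        return None

# Equivalent DSN-string form, matching dsn = 'host=localhost dbname=pandas ...' above:
# psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")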
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
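The Engine.connect and Engine.raw_connection docstrings quoted in this traceback describe the two usual ways of obtaining a connection: a Connection used as a context manager, and a proxied raw DBAPI connection whose close() merely returns it to the pool. A small sketch of both, using an in-memory SQLite engine as a stand-in because the PostgreSQL server this log expects at localhost:5432 is not reachable:

from sqlalchemy import create_engine, text

# Stand-in engine; any postgresql URL here would be refused on this testbed.
engine = create_engine("sqlite:///:memory:")

# Engine.connect(): Connection as a context manager, per the quoted docstring;
# leaving the block returns the DBAPI connection to the pool and rolls back
# any transaction still in progress.
with engine.connect() as connection:
    connection.execute(text("CREATE TABLE t (x INTEGER)"))
    connection.execute(text("INSERT INTO t VALUES (1)"))
    connection.commit()

# Engine.raw_connection(): proxied DBAPI connection; close() returns it to the
# pool rather than closing the underlying driver connection.
raw = engine.raw_connection()
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchall())
finally:
    raw.close()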
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
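The psycopg2 docstring above ends with the keyword-argument handling that the next frame exercises through the private _ext.make_dsn call. A minimal sketch of that step using psycopg2's public helper, psycopg2.extensions.make_dsn; the printed parameter ordering is illustrative, not guaranteed:

    # Sketch only: how the fixture's keyword arguments collapse into the DSN
    # string shown above ('host=localhost dbname=pandas ...'). make_dsn is the
    # public counterpart of the private _ext.make_dsn call in the traceback.
    from psycopg2.extensions import make_dsn

    dsn = make_dsn(host="localhost", dbname="pandas", user="postgres",
                   password="postgres", port=5432)
    print(dsn)  # e.g. 'host=localhost dbname=pandas user=postgres password=postgres port=5432'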
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_date_and_index[postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_date_and_index[postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _____ ERROR at teardown of test_api_timedelta[postgresql_psycopg2_engine] ______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
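Note: the frames above show how SQLAlchemy wraps the driver error: Connection.__init__ catches the psycopg2.OperationalError, _handle_dbapi_exception_noconnection builds a sqlalchemy.exc.OperationalError around it (the "(psycopg2.OperationalError) ..." / e3q8 form reported earlier), and re-raises it from the original. A minimal sketch (assuming sqlalchemy and psycopg2, and the same unreachable URL as the fixture) of catching the wrapped error and recovering the driver exception via .orig:

    # The wrapped SQLAlchemy error carries the original DBAPI exception.
    import sqlalchemy
    from sqlalchemy import exc

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        engine.connect()
    except exc.OperationalError as err:
        # err is sqlalchemy.exc.OperationalError (error code e3q8);
        # err.orig is the underlying psycopg2.OperationalError shown
        # in the inner traceback.
        print(type(err.orig).__name__, err.orig)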
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ______ ERROR at teardown of test_api_timedelta[postgresql_psycopg2_conn] _______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_api_complex_raises[postgresql_psycopg2_engine] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_api_complex_raises[postgresql_psycopg2_conn] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label_multiindex[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
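Every one of these teardown errors has the same root cause: no PostgreSQL server is listening on localhost:5432 in the testbed, so the postgresql_psycopg2_engine fixture's cleanup call, for view in get_all_views(engine), has to open a real connection and fails. psycopg2 raises OperationalError ("Connection refused"), and SQLAlchemy re-raises it as sqlalchemy.exc.OperationalError. The short sketch below is an editorial aside, not part of the captured log; it assumes sqlalchemy and psycopg2 are importable (both are installed in this testbed) and walks the same cleanup path, tolerating the missing server instead of erroring:

    import sqlalchemy
    from sqlalchemy import inspect
    from sqlalchemy.exc import OperationalError

    # Same URL and NullPool setting as the pandas fixture shown in the traceback.
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )

    try:
        # inspect(engine) opens a real DBAPI connection, just like the fixture
        # teardown's get_all_views(engine) -> inspect(conn).get_view_names().
        print(inspect(engine).get_view_names())
    except OperationalError as exc:
        # With nothing listening on localhost:5432 this is the
        # "Connection refused" error recorded at each teardown above.
        print("PostgreSQL not reachable, skipping view cleanup:", exc)

In the run above nothing guards that cleanup, which is why each PostgreSQL-parameterised case records the same wrapped OperationalError during fixture teardown.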
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_to_sql_index_label_multiindex[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_multiindex_roundtrip[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_multiindex_roundtrip[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
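The traceback above bottoms out in psycopg2.connect() with the parameters host=localhost, dbname=pandas, user=postgres, password=postgres, port=5432, and every error in this section is the same "Connection refused" from that call, apparently because no PostgreSQL server is listening on the testbed. A minimal connectivity probe, illustrative only and assuming nothing beyond psycopg2 and the DSN already shown in this log, would look like:

import psycopg2

# Illustrative check, not part of the pandas test suite: try the same DSN
# the fixtures use and report whether a server is accepting connections.
dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
try:
    conn = psycopg2.connect(dsn)
    conn.close()
    print("PostgreSQL is reachable")
except psycopg2.OperationalError as exc:
    print(f"PostgreSQL is not reachable: {exc}")

Given the errors recorded here, the except branch is the one that would fire on this testbed.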
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_dtype_argument[None-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_dtype_argument[None-postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
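The innermost frame above is psycopg2.connect() being handed the cparams dict shown in the traceback. A minimal sketch of the same connection attempt outside the test suite, assuming the same localhost:5432 credentials the pandas fixture uses; with nothing listening on that port it raises the same "Connection refused" OperationalError seen in this log:

import psycopg2

try:
    # Same parameters as the cparams dict in the traceback above.
    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )
except psycopg2.OperationalError as exc:
    # With no PostgreSQL server on localhost:5432 this prints
    # "connection to server at ... failed: Connection refused".
    print(exc)
else:
    conn.close()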
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_dtype_argument[int-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
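The fixture quoted above builds its engine with NullPool and then, during teardown, enumerates views through sqlalchemy.inspect(). A minimal sketch of that same path, assuming the fixture's connection URL; inspect(engine).get_view_names() opens a real connection immediately, which is why the teardown fails in exactly the same way as the test body:

from sqlalchemy import create_engine, exc, inspect
from sqlalchemy.pool import NullPool

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=NullPool,
)

try:
    # Inspector._init_engine calls engine.connect() right away, so this
    # raises OperationalError when nothing listens on localhost:5432.
    print(inspect(engine).get_view_names())
except exc.OperationalError as err:
    print(err)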
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
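The frames above show the driver-level psycopg2.OperationalError being re-raised by _handle_dbapi_exception_noconnection as sqlalchemy.exc.OperationalError, with the original exception chained. A small sketch of how calling code can tell the two layers apart, assuming the same engine URL as the fixture:

from sqlalchemy import create_engine
from sqlalchemy import exc as sa_exc

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

try:
    engine.connect()
except sa_exc.OperationalError as err:
    # err is the SQLAlchemy-level wrapper raised by
    # _handle_dbapi_exception_noconnection; err.orig is the underlying
    # psycopg2.OperationalError from the DBAPI connect() call.
    print(type(err).__name__, "wraps", type(err.orig).__name__)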
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_dtype_argument[int-postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
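The get_all_views helper quoted in these tracebacks dispatches on the connection type; only the SQLAlchemy/PostgreSQL branch needs a live server. A self-contained sketch of its sqlite3 branch against an in-memory database, using the same sqlite_master query shown above:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("CREATE VIEW v AS SELECT x FROM t")

# Same query used by the sqlite3 branch of get_all_views.
cursor = conn.execute("SELECT name FROM sqlite_master WHERE type='view'")
print([row[0] for row in cursor.fetchall()])  # ['v']
conn.close()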
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
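Every one of these teardown errors has the same root cause: nothing accepts TCP connections on localhost:5432 in this testbed, so both the parametrized test and the fixture teardown hit "Connection refused". A minimal sketch of a reachability pre-check that a test module could use to skip the PostgreSQL-backed cases instead of erroring; the helper name is hypothetical and not part of the pandas test suite:

import socket

import pytest


def postgres_is_up(host: str = "localhost", port: int = 5432) -> bool:
    # Hypothetical helper: True only if something accepts TCP connections
    # on the given host/port within the timeout.
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        return False


# Example module-level usage:
pytestmark = pytest.mark.skipif(
    not postgres_is_up(), reason="PostgreSQL not reachable on localhost:5432"
)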
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
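The Engine.connect() docstring quoted in these frames shows the intended context-manager usage. A short sketch of that pattern with an explicit commit, assuming a reachable database at the same URL (unlike this testbed, where the call fails before the block is entered):

from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

# Mirrors the docstring: the connection goes back to the pool when the
# block exits, and any open transaction is rolled back unless commit()
# was called inside the block.
with engine.connect() as connection:
    connection.execute(text("SELECT 1"))
    connection.commit()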
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
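For reference, the psycopg2.connect() call at the bottom of this traceback accepts either a DSN string or keyword arguments, exactly as its docstring above describes. A minimal sketch of the same call outside the test suite, assuming a PostgreSQL server is actually listening on localhost:5432 with the postgres/postgres credentials this run uses (it is not, hence the failures):

    import psycopg2
    from psycopg2 import OperationalError

    try:
        # Keyword form; equivalent to the DSN string shown in the traceback:
        # "host=localhost dbname=pandas user=postgres password=postgres port=5432"
        conn = psycopg2.connect(
            host="localhost",
            dbname="pandas",
            user="postgres",
            password="postgres",
            port=5432,
        )
    except OperationalError as exc:
        # This is the error raised throughout this log: nothing is accepting
        # TCP/IP connections on localhost:5432.
        print(f"connection failed: {exc}")
    else:
        conn.close()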
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_dtype_argument[float-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
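The psycopg2.OperationalError is re-raised by SQLAlchemy wrapped in sqlalchemy.exc.OperationalError, which is why the message appears in the "(psycopg2.OperationalError) ..." form with the https://sqlalche.me/e/20/e3q8 reference. A hedged sketch of catching the wrapped error and reaching the underlying DBAPI exception through its .orig attribute (same URL as the fixture; still requires psycopg2 to be installed):

    import sqlalchemy
    from sqlalchemy import create_engine

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            pass
    except sqlalchemy.exc.OperationalError as exc:
        # exc is the SQLAlchemy wrapper; exc.orig is the original
        # psycopg2.OperationalError seen earlier in this traceback.
        print(type(exc.orig), exc.orig)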
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
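The frames repeated above (_ConnectionFairy._checkout -> _ConnectionRecord.checkout -> _do_get -> _create_connection -> _ConnectionRecord.__connect -> pool._invoke_creator -> dialect.connect) are SQLAlchemy's generic pool checkout path; the creator is only invoked when a brand-new DBAPI connection is needed. A minimal sketch of that same path using an in-memory SQLite creator instead of PostgreSQL, so it runs without any server:

    import sqlite3
    from sqlalchemy.pool import QueuePool

    # The lambda plays the role of pool._invoke_creator in the traceback above.
    pool = QueuePool(lambda: sqlite3.connect(":memory:"), pool_size=1, max_overflow=0)

    conn = pool.connect()   # checkout: creates a ConnectionRecord and calls the creator
    conn.close()            # returns the connection to the pool rather than closing it
    pool.dispose()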
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
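The fixture teardown above calls get_all_views(), which for a SQLAlchemy engine reduces to sqlalchemy.inspect(engine).get_view_names(); constructing the Inspector opens (and immediately closes) a connection, and that is the step failing here. A sketch of the same teardown step in isolation, assuming the database in the fixture's URL is reachable:

    from sqlalchemy import create_engine, inspect

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    # Inspector._init_engine calls engine.connect().close() up front, which is
    # exactly where the log above raises OperationalError.
    inspector = inspect(engine)
    print(inspector.get_view_names())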
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
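Engine.connect(), whose docstring appears a few frames up, is normally used as a context manager; when the block exits, the connection is returned to the pool and any open transaction is rolled back. A short sketch following that docstring, using SQLite and an illustrative table name so it runs without PostgreSQL:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite:///:memory:")

    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()   # SQLAlchemy 2.0 style: explicitly commit the autobegun transaction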
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_dtype_argument[float-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
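Engine.raw_connection(), shown above, hands back a proxied DBAPI connection whose close() returns it to the pool rather than closing it for real. A small sketch against SQLite so it runs without a PostgreSQL server:

    from sqlalchemy import create_engine

    engine = create_engine("sqlite:///:memory:")

    raw = engine.raw_connection()   # proxied DBAPI connection from the pool
    cursor = raw.cursor()
    cursor.execute("SELECT 1")
    print(cursor.fetchone())
    cursor.close()
    raw.close()                     # returns the connection to the pool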
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
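The postgresql_psycopg2_engine fixture above passes poolclass=sqlalchemy.pool.NullPool, so every checkout opens a fresh DBAPI connection and close() genuinely closes it; nothing is held open between tests. A sketch of the same engine construction with the credentials used by the fixture, assuming a local server exists:

    import sqlalchemy
    from sqlalchemy.pool import NullPool

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=NullPool,   # no pooling: each connect() creates, each close() closes
    )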
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_dtype_argument[dtype3-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_dtype_argument[dtype3-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_integer_col_names[postgresql_psycopg2_engine] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
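The frame above (_handle_dbapi_exception_noconnection) is where the driver-level error gets wrapped: raising sqlalchemy_exception "from e" turns the psycopg2.OperationalError into the sqlalchemy.exc.OperationalError carrying the https://sqlalche.me/e/20/e3q8 link seen in these reports. A short sketch of how calling code can get back at the original DBAPI error; the URL is reused from this log and a refused connection is assumed, as here:

    import sqlalchemy
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except sqlalchemy.exc.OperationalError as exc:
        # exc is the SQLAlchemy wrapper raised "from e" above; the underlying
        # psycopg2.OperationalError is preserved on exc.orig.
        print(type(exc.orig).__name__, exc.orig)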
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_integer_col_names[postgresql_psycopg2_conn] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _____ ERROR at teardown of test_api_get_schema[postgresql_psycopg2_engine] _____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ______ ERROR at teardown of test_api_get_schema[postgresql_psycopg2_conn] ______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_get_schema_with_schema[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
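[Editor's sketch, not part of the test log] The frames just above show SQLAlchemy wrapping the driver-level psycopg2.OperationalError into sqlalchemy.exc.OperationalError (the variant that carries the https://sqlalche.me/e/20/e3q8 link). A minimal sketch of how calling code can separate the two layers via the wrapper's .orig attribute; the connection URL mirrors the one in the log and the SELECT is illustrative:

import sqlalchemy
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
except sqlalchemy.exc.OperationalError as exc:
    # exc is the SQLAlchemy wrapper; exc.orig is the underlying DBAPI exception
    # (here a psycopg2.OperationalError carrying the "Connection refused" text).
    print(type(exc.orig).__name__, exc.orig)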
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_get_schema_with_schema[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_get_schema_dtypes[postgresql_psycopg2_engine] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_get_schema_dtypes[postgresql_psycopg2_conn] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_get_schema_keys[postgresql_psycopg2_engine] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
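Note on the fixture teardown above: get_all_views (pandas/tests/io/test_sql.py) only needs a live server for its SQLAlchemy branch; the sqlite3 branch runs against any local connection. A minimal, self-contained sketch of that first branch, using an in-memory database instead of the failing PostgreSQL engine (illustrative only, not part of the test suite):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (a INTEGER)")
    conn.execute("CREATE VIEW v AS SELECT a FROM t")
    # Same query the sqlite3 branch of get_all_views issues:
    cur = conn.execute("SELECT name FROM sqlite_master WHERE type='view'")
    print([row[0] for row in cur.fetchall()])  # ['v']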
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_api_get_schema_keys[postgresql_psycopg2_conn] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_api_chunksize_read[postgresql_psycopg2_engine] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_api_chunksize_read[postgresql_psycopg2_conn] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_api_categorical[postgresql_psycopg2_engine] _____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
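As the frames above show, Connection.__init__ catches the driver error and _handle_dbapi_exception_noconnection re-raises it wrapped as sqlalchemy.exc.OperationalError (the sqlalche.me/e/20/e3q8 reference), chained from the original psycopg2.OperationalError. A hedged sketch of catching the wrapped form while still reaching the driver-level exception, assuming the same unreachable localhost URL as in this log:

# Illustrative only; assumes no PostgreSQL server is listening locally.
import sqlalchemy
from sqlalchemy import exc

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)

try:
    with engine.connect():
        pass
except exc.OperationalError as err:
    # err.orig is the underlying psycopg2.OperationalError chained "from e"
    print(type(err.orig).__name__, err.orig)
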
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _____ ERROR at teardown of test_api_categorical[postgresql_psycopg2_conn] ______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_unicode_column_name[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
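As the _handle_dbapi_exception_noconnection frame above shows, SQLAlchemy re-raises the driver error wrapped in sqlalchemy.exc.OperationalError (a DBAPIError subclass), chained "from" the original psycopg2 exception, which is why both exceptions appear in each error block of this log. A small sketch, assuming the same unreachable server, showing that the underlying driver error stays available on the wrapper:

import sqlalchemy
from sqlalchemy.exc import OperationalError

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)

try:
    engine.connect().close()
except OperationalError as exc:
    # The wrapper keeps the original DBAPI exception on .orig (and as
    # __cause__), matching the chained traceback above.
    print(type(exc).__name__, "caused by", type(exc.orig).__name__)
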
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_unicode_column_name[postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_escaped_table_name[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_api_escaped_table_name[postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_read_sql_duplicate_columns[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
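The frames above show how SQLAlchemy wraps the driver-level psycopg2.OperationalError into its own sqlalchemy.exc.OperationalError inside _handle_dbapi_exception_noconnection (hence the "https://sqlalche.me/e/20/e3q8" link in the message). Code that wants to degrade gracefully therefore catches the SQLAlchemy exception rather than the psycopg2 one; a short sketch, assuming the same engine URL as in the log, is given below.

    # Sketch: catching the wrapped error that engine.connect() raises when the
    # server is unreachable.  The SQLAlchemy OperationalError keeps the original
    # psycopg2 error as its .orig attribute (and as __cause__).
    import sqlalchemy
    from sqlalchemy import exc

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as connection:
            print("connected:", connection)
    except exc.OperationalError as err:
        print("database unavailable:", err.orig)  # underlying psycopg2 error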
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_api_read_sql_duplicate_columns[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
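The DSN echoed above ('host=localhost dbname=pandas user=postgres password=postgres port=5432') carries the parameters the pandas test suite uses for its PostgreSQL-backed tests. As a hedged diagnostic aside (not part of the suite), the same parameters can be passed straight to psycopg2 to check whether anything is listening on localhost:5432 at all; the credentials are simply the ones visible in this log:

    import psycopg2

    try:
        # Same parameters the pandas fixture passes through SQLAlchemy.
        conn = psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
        conn.close()
        print("PostgreSQL is reachable")
    except psycopg2.OperationalError as exc:
        # This is the "Connection refused" error that SQLAlchemy wraps below.
        print(f"PostgreSQL is not reachable: {exc}")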
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_read_table_columns[postgresql_psycopg2_engine] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
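As the _handle_dbapi_exception_noconnection frames above show, SQLAlchemy wraps the driver-level psycopg2.OperationalError in its own sqlalchemy.exc.OperationalError (the "Background on this error at: https://sqlalche.me/e/20/e3q8" link in the messages), which is why the same refused connection is reported twice with different prefixes. A small sketch, assuming the SQLAlchemy 2.0 packages installed here, of how the original driver error stays reachable on the wrapper:

    from sqlalchemy import exc

    def describe(error: exc.OperationalError) -> str:
        # The wrapped exception keeps the original driver error on .orig,
        # so a refused connection can be told apart from, say, a bad password.
        driver_error = error.orig
        return f"{type(driver_error).__name__}: {driver_error}"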
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_read_table_columns[postgresql_psycopg2_conn] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
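The psycopg2 connect() docstring quoted in the frame above describes two equivalent calling styles, a DSN string or keyword arguments. The following is an illustrative sketch only, not part of the test suite: it reuses the parameter values visible in this log (host=localhost, dbname=pandas, user=postgres, port=5432), and on this testbed, where no PostgreSQL server is listening on port 5432, it raises the same "Connection refused" OperationalError reported below.

# Illustrative sketch (not part of the test suite): the two psycopg2.connect()
# calling styles from the docstring above, using the parameters seen in this log.
import psycopg2

dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
try:
    conn = psycopg2.connect(dsn)  # DSN string form
    conn.close()
    conn = psycopg2.connect(  # keyword-argument form
        dbname="pandas", user="postgres", password="postgres",
        host="localhost", port=5432,
    )
    conn.close()
except psycopg2.OperationalError as exc:
    # With nothing listening on localhost:5432 (as on this testbed), this prints
    # the same "Connection refused" message that appears in the traceback.
    print(exc)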
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_read_table_index_col[postgresql_psycopg2_engine] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
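The teardown failure above reduces to two steps that are both quoted in the traceback: the postgresql_psycopg2_engine fixture creates an engine with a NullPool, and get_all_views() calls sqlalchemy.inspect(engine).get_view_names(), which has to open a connection. A minimal sketch of that path, assuming (unlike this testbed) a PostgreSQL server is reachable at the fixture's URL:

# Minimal sketch of the fixture teardown path quoted above; the URL and NullPool
# come from the pandas fixture, the inspect() call from get_all_views().
import sqlalchemy

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)
try:
    print(sqlalchemy.inspect(engine).get_view_names())  # needs a live server
except sqlalchemy.exc.OperationalError as exc:
    print(exc)  # on this testbed: "(psycopg2.OperationalError) ... Connection refused"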
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
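The Engine.raw_connection() docstring quoted in this traceback explains that the returned object is a proxied DBAPI connection whose close() hands it back to the pool instead of really closing it. A hedged sketch of that usage, again assuming a reachable PostgreSQL server rather than the unreachable one on this testbed:

# Sketch of Engine.raw_connection() per the docstring above; assumes a reachable server.
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
raw = engine.raw_connection()  # proxied DBAPI connection checked out of the pool
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())
finally:
    raw.close()  # returns the connection to the pool rather than closing it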
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_read_table_index_col[postgresql_psycopg2_conn] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
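For completeness, the Engine.connect() docstring quoted earlier in these tracebacks recommends the context-manager form, under which leaving the block returns the underlying DBAPI connection to the pool and rolls back any transaction still in progress. A short sketch under the same assumption of a reachable server:

# Sketch of the Engine.connect() context-manager pattern from the docstring
# quoted in these tracebacks; assumes a reachable PostgreSQL server.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
with engine.connect() as connection:
    result = connection.execute(text("SELECT 1"))  # the docstring's own example uses an INSERT
    print(result.scalar())
    connection.commit()
# On exit the connection goes back to the pool and any still-open transaction is rolled back.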
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
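The _handle_dbapi_exception_noconnection frame above is where the raw psycopg2.OperationalError gets wrapped into sqlalchemy.exc.OperationalError ("The above exception was the direct cause of the following exception" in this log). A small sketch, assuming the same unreachable server as on this testbed, showing that the original driver error stays accessible on the wrapper's .orig attribute:

# Sketch: the wrapped SQLAlchemy error keeps the original psycopg2 error on .orig.
import psycopg2
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    engine.connect()
except sqlalchemy.exc.OperationalError as exc:
    assert isinstance(exc.orig, psycopg2.OperationalError)  # driver-level error preserved
    print(exc.orig)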
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_read_sql_delegate[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_read_sql_delegate[postgresql_psycopg2_conn_iris] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_warning_case_insensitive_table_name[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
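As the frames above show, the driver-level psycopg2.OperationalError is wrapped by _handle_dbapi_exception_noconnection into a sqlalchemy.exc.OperationalError and re-raised "from" the original, which is why the log prints both exceptions joined by "The above exception was the direct cause of the following exception". A hedged sketch of what that wrapping looks like from calling code (URL taken from the log; the query is illustrative):

import sqlalchemy
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
except sqlalchemy.exc.OperationalError as exc:
    # exc is the wrapper built by exc.DBAPIError.instance() in the frame above;
    # exc.orig carries the underlying psycopg2.OperationalError.
    print(type(exc.orig).__name__, exc.orig)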
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_warning_case_insensitive_table_name[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
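The _handle_dbapi_exception_noconnection frame above also iterates dialect.dispatch.handle_error, giving registered error handlers a chance to inspect the failure, and optionally return a replacement exception, before SQLAlchemy re-raises it. A minimal sketch of hooking that event; the listener body is purely illustrative:

from sqlalchemy import create_engine, event

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

@event.listens_for(engine, "handle_error")
def log_db_errors(context):
    # 'context' corresponds to the ExceptionContextImpl built in the frame above:
    # context.original_exception is the psycopg2 error, context.is_disconnect
    # reflects dialect.is_disconnect(), and an exception returned here would
    # become the 'newraise' that replaces the one being raised.
    print("DB error:", context.original_exception,
          "disconnect:", context.is_disconnect)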
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
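The psycopg2.connect docstring quoted above accepts either a single libpq DSN string or the equivalent keyword arguments (which are merged into a DSN via make_dsn, as the traceback shows). A minimal sketch of both call forms, reusing the same localhost/pandas parameters the failing fixture passes; either form can only succeed if a server is actually listening:

    import psycopg2

    # DSN string form, identical to the dsn value shown in the traceback
    conn = psycopg2.connect(
        "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    )
    conn.close()

    # keyword form; psycopg2 folds these into the same DSN internally
    conn = psycopg2.connect(
        host="localhost", dbname="pandas", user="postgres",
        password="postgres", port=5432,
    )
    conn.close()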
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_type_mapping[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
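_handle_dbapi_exception_noconnection, shown above, wraps the driver-level psycopg2.OperationalError in sqlalchemy.exc.OperationalError (a DBAPIError subclass) and chains the original with "from e", which is why each failure is reported twice in this log, once per exception type. A minimal sketch of catching the wrapped error and reaching the underlying driver exception, assuming the same unreachable URL:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except OperationalError as err:
        # err.orig is the original psycopg2.OperationalError instance
        print(type(err.orig).__name__, err.orig)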
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_type_mapping[postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
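For reference, the two call forms described in the psycopg2.connect() docstring quoted above look like this; the parameter values are the same ones shown in cparams in this traceback, and on this testbed either call fails because nothing is listening on localhost:5432 (an illustrative sketch, not part of the test suite):

    import psycopg2

    # keyword-argument form (equivalent to the cparams dict in the traceback)
    conn = psycopg2.connect(
        dbname="pandas",
        user="postgres",
        password="postgres",
        host="localhost",
        port=5432,
    )

    # DSN-string form; make_dsn() builds the same string internally
    conn = psycopg2.connect(
        "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    )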
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
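_handle_dbapi_exception_noconnection() above wraps the driver error in a SQLAlchemy exception and re-raises it with "from e", which is why the log shows both a psycopg2.OperationalError and a sqlalchemy.exc.OperationalError for the same failure. A caller can reach the original driver exception through the wrapper, roughly like this (illustrative sketch, same connection URL as the fixture):

    import sqlalchemy
    from sqlalchemy import exc

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )
    try:
        engine.connect()
    except exc.OperationalError as err:
        # err is the wrapper raised by _handle_dbapi_exception_noconnection();
        # err.orig (also reachable as err.__cause__) is the underlying
        # psycopg2.OperationalError("... Connection refused") seen above.
        print(type(err.orig).__name__, err.orig)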
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
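Note: the dsn value shown above is assembled by psycopg2 from the keyword arguments SQLAlchemy passes through to the driver. A small sketch of that step, assuming psycopg2 is installed (the parameters are the ones reported in the traceback):

    from psycopg2.extensions import make_dsn

    # psycopg2.connect() merges keyword arguments into a libpq DSN string
    # before dialing the server.
    dsn = make_dsn(host="localhost", dbname="pandas", user="postgres",
                   password="postgres", port=5432)
    print(dsn)  # e.g. 'host=localhost dbname=pandas user=postgres password=postgres port=5432'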
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
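Note: as the frame above shows, Connection._handle_dbapi_exception_noconnection() wraps the raw psycopg2 error in sqlalchemy.exc.OperationalError and re-raises it "from" the original, which is why the log alternates between the bare psycopg2.OperationalError and the SQLAlchemy-wrapped one carrying the sqlalche.me/e/20/e3q8 link. A sketch of observing that wrapping, assuming psycopg2 is installed and localhost:5432 is still unreachable:

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        engine.connect()
    except OperationalError as exc:
        print(type(exc.orig).__name__)    # OperationalError (the psycopg2 one)
        print(exc.__cause__ is exc.orig)  # True: re-raised "from" the DBAPI error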
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
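The frames above also explain why the same underlying failure appears in two shapes in this log: psycopg2 raises a plain psycopg2.OperationalError, and Connection._handle_dbapi_exception_noconnection wraps it into sqlalchemy.exc.OperationalError, which is the variant carrying the "(Background on this error at: https://sqlalche.me/e/20/e3q8)" suffix. The original driver exception stays reachable on the wrapper. A short sketch of that relationship against the same unreachable server (the engine URL is taken from the log; everything else is illustrative):

    import sqlalchemy
    from sqlalchemy import exc

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except exc.OperationalError as err:
        # err is SQLAlchemy's wrapper; err.orig is the psycopg2.OperationalError
        # the pool's creator raised when the TCP connection was refused.
        print(type(err.orig))
        # err.code is the short code SQLAlchemy uses to build the sqlalche.me
        # link seen in the log ("e3q8" for this connection failure).
        print(err.code)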
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
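[editor's sketch, not part of the test output] Per the Engine.connect() docstring just quoted, connections are normally used as context managers, and _handle_dbapi_exception_noconnection() shows that a driver error raised before any connection exists is re-raised wrapped in a SQLAlchemy DBAPIError subclass with the original exception chained. A brief sketch of both points, assuming the same local database URL as elsewhere in this log:

import sqlalchemy
from sqlalchemy import text
from sqlalchemy.exc import OperationalError

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)

try:
    # Context-manager use from the docstring: the connection is returned to the
    # pool (and any open transaction rolled back) when the block exits.
    with engine.connect() as connection:
        connection.execute(text("SELECT 1"))
        connection.commit()
except OperationalError as exc:
    # SQLAlchemy wraps the psycopg2 error; the original DBAPI exception is
    # available as exc.orig and as the __cause__ of the wrapper, which is why
    # the log shows both a psycopg2.OperationalError and a
    # sqlalchemy.exc.OperationalError for the same failure.
    print("wrapped:", type(exc).__name__)
    print("original:", type(exc.orig).__name__)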
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
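The psycopg2.connect docstring reproduced in the traceback above describes two equivalent call styles, a DSN string and keyword arguments. A minimal sketch of both follows, reusing the same placeholder host and credentials that appear in this log's DSN (host=localhost dbname=pandas user=postgres password=postgres port=5432); it can only succeed if a PostgreSQL server is actually listening there, which is exactly what is missing in this test run.

import psycopg2

# DSN-string form, mirroring the DSN shown in this log (placeholder credentials)
conn = psycopg2.connect(
    "host=localhost dbname=pandas user=postgres password=postgres port=5432"
)
conn.close()

# Keyword-argument form with the equivalent parameters
conn = psycopg2.connect(
    host="localhost",
    dbname="pandas",
    user="postgres",
    password="postgres",
    port=5432,
)
conn.close()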
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
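The teardown failure originates in pandas' postgresql_psycopg2_engine fixture shown above: after yielding the engine it calls get_all_views(), which for a SQLAlchemy engine runs inspect(conn).get_view_names() and therefore has to open a real connection. A sketch of that inspection path under the same assumptions as the fixture (a PostgreSQL server at localhost:5432, which this testbed does not provide):

import sqlalchemy
from sqlalchemy import inspect

# Same URL and pool class as the fixture in pandas/tests/io/test_sql.py;
# the credentials are placeholders and nothing answers on this port here.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)
# Connects first (this is where OperationalError is raised in this log),
# then reflects the view names from the database.
views = inspect(engine).get_view_names()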
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
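The Engine.connect docstring quoted above shows the intended context-manager usage: the connection's DBAPI resources return to the pool when the block exits, and any transaction still in progress is rolled back. A short sketch of that pattern, assuming a reachable database and a hypothetical table name (some_table/col are illustrations, not objects created by these tests):

from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
with engine.connect() as connection:
    # some_table/col are placeholders; the statement mirrors the docstring's example
    connection.execute(text("INSERT INTO some_table (col) VALUES ('foo')"))
    connection.commit()
# leaving the block returns the DBAPI connection to the pool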
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
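For reference, the psycopg2.connect docstring quoted above describes the keyword parameters the driver accepts. The following is an illustrative sketch only (not part of the test run): it attempts the same connection the fixture makes, using the DSN values visible in the log, and treats OperationalError as "server not reachable".

    # Illustrative sketch: same parameters as the DSN in the log
    # ('host=localhost dbname=pandas user=postgres password=postgres port=5432').
    import psycopg2
    from psycopg2 import OperationalError

    def try_connect():
        try:
            return psycopg2.connect(
                host="localhost",
                port=5432,
                dbname="pandas",
                user="postgres",
                password="postgres",
            )
        except OperationalError as exc:
            # This is the failure mode in the log: nothing is listening on
            # localhost:5432, so the driver raises immediately.
            print(f"connection failed: {exc}")
            return None

    if __name__ == "__main__":
        conn = try_connect()
        if conn is not None:
            conn.close()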
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
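The postgresql_psycopg2_engine fixture quoted above builds a throwaway engine with NullPool and, at teardown, enumerates views through sqlalchemy.inspect, which is the step that fails here. A minimal standalone sketch of that pattern, assuming a reachable server (which this testbed does not have):

    # Sketch of the engine/teardown pattern shown in the fixture above.
    import sqlalchemy
    from sqlalchemy import inspect

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,  # no pooled connections kept around
    )
    try:
        # inspect(engine) opens (and immediately closes) a connection; this is
        # exactly where the OperationalError in the log is raised.
        print("views:", inspect(engine).get_view_names())
    finally:
        engine.dispose()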
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
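The frames above show SQLAlchemy wrapping the driver's error via _handle_dbapi_exception_noconnection, which is why the same failure appears in this log both as psycopg2.OperationalError and as sqlalchemy.exc.OperationalError. A small illustrative sketch of catching the wrapped form and reading the original driver error from its .orig attribute:

    # Sketch: the wrapped exception keeps the original DBAPI error on .orig,
    # matching the "(psycopg2.OperationalError) ..." prefix seen in the log.
    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(sqlalchemy.text("SELECT 1"))
    except OperationalError as exc:
        print("SQLAlchemy-wrapped error:", exc)
        print("underlying DBAPI error:", repr(exc.orig))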
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
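[Note] The frames above show SQLAlchemy wrapping the driver error into sqlalchemy.exc.OperationalError (a DBAPIError subclass) and re-raising it "from" the original psycopg2 exception. A short sketch, under the same assumed test URL, of how a caller can get the underlying driver error back via the wrapper's .orig attribute:

    # The SQLAlchemy wrapper keeps the original DBAPI exception on .orig.
    import sqlalchemy
    from sqlalchemy.exc import DBAPIError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(sqlalchemy.text("SELECT 1"))
    except DBAPIError as exc:
        # exc is the SQLAlchemy-level error; exc.orig is the
        # psycopg2.OperationalError carrying "Connection refused".
        print(type(exc).__name__, "wrapping", type(exc.orig).__name__)
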
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
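The frames above also show how SQLAlchemy rewraps the driver error: Connection._handle_dbapi_exception_noconnection wraps the psycopg2.OperationalError in a sqlalchemy.exc.OperationalError and re-raises it "from e", so the original driver exception remains available as .orig and __cause__. A small sketch of catching it that way (engine URL is illustrative):

    import sqlalchemy
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except sqlalchemy.exc.OperationalError as exc:
        # exc is the SQLAlchemy wrapper; exc.orig is the underlying
        # psycopg2.OperationalError raised at connect time.
        print(type(exc.orig).__name__, exc.orig)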
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
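[editor's note] As the frames above show, the driver-level psycopg2.OperationalError is wrapped by _handle_dbapi_exception_noconnection into sqlalchemy.exc.OperationalError before being re-raised. A small sketch (hypothetical caller, same connection URL as the fixture) of catching the wrapped form:

import sqlalchemy
from sqlalchemy.exc import OperationalError

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect() as connection:
        pass
except OperationalError as exc:
    # exc.orig is the underlying psycopg2.OperationalError seen earlier in the log.
    print("could not connect:", exc.orig)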
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
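[editor's note] Echoing the Engine.connect docstring quoted above, a short usage sketch (assumes a reachable server; the statement is only illustrative):

from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
with engine.connect() as connection:
    connection.execute(text("SELECT 1"))
    connection.commit()
# On this testbed the block above fails inside engine.connect() itself,
# since no PostgreSQL server is listening on port 5432.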
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
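The connect() wrapper shown above folds its keyword arguments into a libpq DSN string via make_dsn before handing off to the C-level _connect. A small sketch of that conversion (values taken from the cparams in this traceback; assumes psycopg2 >= 2.7 for psycopg2.extensions.make_dsn):

from psycopg2.extensions import make_dsn

dsn = make_dsn(host="localhost", dbname="pandas", user="postgres",
               password="postgres", port=5432)
# Yields a keyword=value string equivalent to the dsn reported above, e.g.
# 'host=localhost dbname=pandas user=postgres password=postgres port=5432'
# (key order may differ).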
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_database_uri_string[postgresql_psycopg2_engine] ___ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
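The _handle_dbapi_exception_noconnection path above is what turns the raw psycopg2.OperationalError into the sqlalchemy.exc.OperationalError that pytest reports, with the driver error kept on the wrapper. A small illustration, assuming SQLAlchemy 2.x and the same connection URL as in this log:

    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except sqlalchemy.exc.OperationalError as exc:
        print(type(exc.orig))  # the underlying psycopg2.OperationalError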
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ___ ERROR at teardown of test_database_uri_string[postgresql_psycopg2_conn] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_pg8000_sqlalchemy_passthrough_error[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_pg8000_sqlalchemy_passthrough_error[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
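A minimal sketch of the two psycopg2.connect() call forms described in the docstring above, using the same host/dbname/user/password/port values that appear in this traceback; it assumes a PostgreSQL server is actually listening on localhost:5432, which is not the case on this testbed:

import psycopg2

# DSN-string form (mirrors the dsn value shown in the traceback above)
conn = psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")
conn.close()

# Keyword-argument form; psycopg2 builds the same DSN internally via make_dsn()
conn = psycopg2.connect(dbname="pandas", user="postgres", password="postgres", host="localhost", port=5432)
conn.close()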
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_query_by_text_obj[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
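The fixture teardown above calls pandas' get_all_views(), which for an Engine falls through to sqlalchemy.inspect(). A minimal sketch of that call path, assuming the same engine URL as the fixture and a reachable server (otherwise it raises the OperationalError repeated throughout this log):

import sqlalchemy

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
# inspect() returns an Inspector for an Engine; _init_engine opens and closes a
# connection immediately, which is where this log's teardown errors originate.
inspector = sqlalchemy.inspect(engine)
print(inspector.get_view_names())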
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
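As the traceback shows, SQLAlchemy wraps the driver-level psycopg2.OperationalError in sqlalchemy.exc.OperationalError before re-raising it. A minimal sketch of how a caller could observe the same "Connection refused" failure seen here, using the log's engine URL:

import sqlalchemy
from sqlalchemy.exc import OperationalError

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect() as connection:
        connection.execute(sqlalchemy.text("SELECT 1"))
except OperationalError as err:
    # With no PostgreSQL server on localhost:5432 this prints the same
    # "Connection refused" message repeated throughout this log.
    print(err)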
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_query_by_text_obj[postgresql_psycopg2_conn_iris] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
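For contrast, the sqlite3 branch of get_all_views() shown above needs no running server; a self-contained sketch of that same sqlite_master query:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("CREATE VIEW v AS SELECT x FROM t")
cur = conn.execute("SELECT name FROM sqlite_master WHERE type='view'")
print([row[0] for row in cur.fetchall()])  # ['v']
conn.close()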
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_query_by_select_obj[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_query_by_select_obj[postgresql_psycopg2_conn_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_column_with_percentage[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __ ERROR at teardown of test_column_with_percentage[postgresql_psycopg2_conn] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ______ ERROR at teardown of test_create_table[postgresql_psycopg2_engine] ______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
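[editorial note] The _handle_dbapi_exception_noconnection frame above is where SQLAlchemy wraps the raw DBAPI exception: because the psycopg2 error is an instance of the dialect's loaded_dbapi.Error, it is re-raised as sqlalchemy.exc.OperationalError with the original exception chained. A short sketch of how caller code can rely on that wrapping (the connection URL is the same one the tests assume):

import sqlalchemy
from sqlalchemy.exc import DBAPIError

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect() as connection:
        connection.execute(sqlalchemy.text("SELECT 1"))
except DBAPIError as exc:
    # exc.orig holds the underlying psycopg2 exception; connection_invalidated
    # reflects the is_disconnect decision taken in the frame above.
    print(exc.orig, exc.connection_invalidated)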
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _______ ERROR at teardown of test_create_table[postgresql_psycopg2_conn] _______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _______ ERROR at teardown of test_drop_table[postgresql_psycopg2_engine] _______ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
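The traceback above bottoms out in psycopg2.connect(), whose docstring (quoted in the log) lists the accepted keyword parameters; the cparams shown are dbname=pandas, host=localhost, user=postgres, password=postgres, port=5432. A minimal sketch of the same DBAPI-level connection attempt outside the test suite, assuming only that psycopg2 is installed (the helper name and structure are illustrative, not part of the pandas tests):

    import psycopg2
    from psycopg2 import OperationalError

    # Same parameters as the cparams shown in the traceback; the values come
    # from the log, the helper itself is only an illustration.
    PARAMS = {
        "dbname": "pandas",
        "user": "postgres",
        "password": "postgres",
        "host": "localhost",
        "port": 5432,
    }

    def postgres_is_up(connect_params=PARAMS):
        # Attempt the same connection the pool creator makes above.
        try:
            conn = psycopg2.connect(**connect_params)
        except OperationalError:
            # "Connection refused" from the log lands here when nothing is
            # listening on localhost:5432.
            return False
        conn.close()
        return True

    if __name__ == "__main__":
        print(postgres_is_up())

When no PostgreSQL server is listening on localhost:5432, this reports False with the same "Connection refused" OperationalError seen throughout this run.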
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
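The pandas fixture quoted above (postgresql_psycopg2_engine) yields a NullPool engine and, at teardown, calls get_all_views(engine); for a SQLAlchemy engine that path goes through sqlalchemy.inspect(engine).get_view_names(), and the inspector opens a connection immediately, which is where the OperationalError is raised. A small sketch of that teardown path in isolation, assuming sqlalchemy and psycopg2 are installed; the function name is illustrative:

    import sqlalchemy
    from sqlalchemy import inspect
    from sqlalchemy.exc import OperationalError

    def list_pandas_test_views():
        # Same URL and pool class as the fixture in pandas/tests/io/test_sql.py.
        engine = sqlalchemy.create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
            poolclass=sqlalchemy.pool.NullPool,
        )
        try:
            # inspect(engine) connects right away (Inspector._init_engine calls
            # engine.connect().close()), which is where the teardown above fails.
            return inspect(engine).get_view_names()
        except OperationalError:
            # No PostgreSQL server reachable on localhost:5432.
            return None

With no server on port 5432 this returns None instead of the view list, mirroring the teardown errors reported for each postgresql_psycopg2 test in this log.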
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ________ ERROR at teardown of test_drop_table[postgresql_psycopg2_conn] ________ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _______ ERROR at teardown of test_roundtrip[postgresql_psycopg2_engine] ________ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ________ ERROR at teardown of test_roundtrip[postgresql_psycopg2_conn] _________ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s ____ ERROR at teardown of test_execute_sql[postgresql_psycopg2_engine_iris] ____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _____ ERROR at teardown of test_execute_sql[postgresql_psycopg2_conn_iris] _____ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
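The psycopg2.connect docstring captured above describes two equivalent calling styles (a libpq DSN string or keyword arguments). A minimal sketch of both, using the same parameters the pandas fixture builds, is shown below; it assumes a PostgreSQL server is actually listening on localhost:5432, which is exactly what is missing on this testbed:

    import psycopg2

    # Style 1: a single libpq-style DSN string (as shown in the dsn variable above).
    conn = psycopg2.connect(
        "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    )
    conn.close()

    # Style 2: the same parameters as keyword arguments.
    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )
    conn.close()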
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_read_table[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
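get_all_views() in the pandas fixture teardown above falls through to SQLAlchemy's runtime inspection API for anything that is not a sqlite3 or ADBC connection. A minimal standalone sketch of that call is below; the URL matches the fixture's, and it assumes a reachable server (which this testbed does not have, hence the errors):

    from sqlalchemy import create_engine, inspect

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    # inspect(Engine) builds an Inspector; Inspector._init_engine() opens and
    # immediately closes a real connection, which is where the traceback above fails.
    print(inspect(engine).get_view_names())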
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_read_table[postgresql_psycopg2_conn_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_read_table_columns[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_read_table_columns[postgresql_psycopg2_conn_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_read_table_absent_raises[postgresql_psycopg2_engine_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
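[editorial note] As the _handle_dbapi_exception_noconnection frames above show, SQLAlchemy wraps the driver-level psycopg2.OperationalError into sqlalchemy.exc.OperationalError before re-raising it. A minimal sketch of handling that wrapped error when the server is unreachable, using the same hypothetical engine URL as the fixture:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as connection:
            connection.execute(sqlalchemy.text("SELECT 1"))
    except OperationalError as exc:
        # exc.orig is the underlying psycopg2.OperationalError (connection refused here)
        print("database unreachable:", exc.orig)
    finally:
        engine.dispose()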
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_read_table_absent_raises[postgresql_psycopg2_conn_iris] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_default_type_conversion[postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_sqlalchemy_default_type_conversion[postgresql_psycopg2_conn_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _________ ERROR at teardown of test_bigint[postgresql_psycopg2_engine] _________ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s __________ ERROR at teardown of test_bigint[postgresql_psycopg2_conn] __________ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_default_date_load[postgresql_psycopg2_engine_types] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_default_date_load[postgresql_psycopg2_conn_types] __ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_query[None-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_query[None-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
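For reference, a minimal sketch of the call this fixture is effectively making, using the keyword form of the connection parameters described in the psycopg2.connect docstring above. The dbname/user/password/host/port values mirror the cparams dict shown in the frame; it assumes a PostgreSQL server is listening on localhost:5432, which is not the case on this testbed, so it takes the error branch:

    import psycopg2

    try:
        # same parameters as the cparams dict in the frame above
        conn = psycopg2.connect(
            dbname="pandas",
            user="postgres",
            password="postgres",
            host="localhost",
            port=5432,
        )
    except psycopg2.OperationalError as exc:
        # no server listens on localhost:5432 here, so this branch runs
        print(f"connection failed: {exc}")
    else:
        conn.close()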
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
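The pandas fixture teardown shown above enumerates database views through SQLAlchemy's inspection API (inspect(engine).get_view_names()). A minimal sketch of that path in isolation; a hypothetical in-memory SQLite engine stands in for the unreachable PostgreSQL one, since inspect() is built from any Engine the same way:

    import sqlalchemy
    from sqlalchemy import inspect

    # stand-in engine; the failing fixture uses
    # postgresql+psycopg2://postgres:***@localhost:5432/pandas
    engine = sqlalchemy.create_engine("sqlite:///:memory:")

    # the Inspector reflects schema objects from the Engine; with the
    # PostgreSQL engine above, constructing it is what needs a live server
    inspector = inspect(engine)
    print(inspector.get_view_names())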
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
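As documented in the Engine.connect docstring quoted earlier in this traceback, a Connection is normally used as a context manager. A short sketch combining that usage with the NullPool setup from the pandas fixture (same URL and credentials); against this testbed the connect() call itself raises sqlalchemy.exc.OperationalError because nothing listens on port 5432:

    import sqlalchemy
    from sqlalchemy import text

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,  # same pool class as the fixture
    )

    try:
        # context-manager form from the Engine.connect docstring above
        with engine.connect() as connection:
            connection.execute(text("SELECT 1"))
            connection.commit()
    except sqlalchemy.exc.OperationalError as exc:
        print(f"no PostgreSQL server reachable: {exc}")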
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
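The _handle_dbapi_exception_noconnection frames in this traceback show the raw psycopg2 error being wrapped into sqlalchemy.exc.OperationalError and re-raised "from e". A small sketch of how the original driver exception can still be reached from the wrapped one (same unreachable URL as the fixture; assumes only that no server answers on port 5432):

    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        engine.connect()
    except sqlalchemy.exc.OperationalError as exc:
        # the wrapped error keeps the driver-level exception both as .orig
        # and as __cause__ (set by the "raise ... from e" seen above)
        print(type(exc.orig))             # psycopg2.OperationalError
        print(exc.orig is exc.__cause__)  # True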
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s [identical connection-refused traceback repeated for the postgresql_psycopg2_conn teardown; the intermediate SQLAlchemy pool-checkout and psycopg2.connect frames are verbatim the same as in the previous teardown error above and are omitted here, the log resumes at the final psycopg2 frame below]
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_table[postgresql_psycopg2_engine] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2409s schema["db_schema_name"] 2409s for table in schema["db_schema_tables"]: 2409s if table["table_type"] == "view": 2409s view_name = table["table_name"] 2409s results.append(view_name) 2409s 2409s return results 2409s else: 2409s from sqlalchemy import inspect 2409s 2409s > return inspect(conn).get_view_names() 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s raiseerr = True 2409s 2409s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2409s """Produce an inspection object for the given target. 2409s 2409s The returned value in some cases may be the 2409s same object as the one given, such as if a 2409s :class:`_orm.Mapper` object is passed. In other 2409s cases, it will be an instance of the registered 2409s inspection type for the given object, such as 2409s if an :class:`_engine.Engine` is passed, an 2409s :class:`_reflection.Inspector` object is returned. 2409s 2409s :param subject: the subject to be inspected. 2409s :param raiseerr: When ``True``, if the given subject 2409s does not 2409s correspond to a known SQLAlchemy inspected type, 2409s :class:`sqlalchemy.exc.NoInspectionAvailable` 2409s is raised. If ``False``, ``None`` is returned. 
2409s 2409s """ 2409s type_ = type(subject) 2409s for cls in type_.__mro__: 2409s if cls in _registrars: 2409s reg = _registrars.get(cls, None) 2409s if reg is None: 2409s continue 2409s elif reg is True: 2409s return subject 2409s > ret = reg(subject) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @inspection._inspects(Engine) 2409s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2409s > return Inspector._construct(Inspector._init_engine, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s init = 2409s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s @classmethod 2409s def _construct( 2409s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2409s ) -> Inspector: 2409s if hasattr(bind.dialect, "inspector"): 2409s cls = bind.dialect.inspector 2409s 2409s self = cls.__new__(cls) 2409s > init(self, bind) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def _init_engine(self, engine: Engine) -> None: 2409s self.bind = self.engine = engine 2409s > engine.connect().close() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def connect(self) -> Connection: 2409s """Return a new :class:`_engine.Connection` object. 2409s 2409s The :class:`_engine.Connection` acts as a Python context manager, so 2409s the typical use of this method looks like:: 2409s 2409s with engine.connect() as connection: 2409s connection.execute(text("insert into table values ('foo')")) 2409s connection.commit() 2409s 2409s Where above, after the block is completed, the connection is "closed" 2409s and its underlying DBAPI resources are returned to the connection pool. 2409s This also has the effect of rolling back any transaction that 2409s was explicitly begun or was begun via autobegin, and will 2409s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2409s started and is still in progress. 2409s 2409s .. 
seealso:: 2409s 2409s :meth:`_engine.Engine.begin` 2409s 2409s """ 2409s 2409s > return self._connection_cls(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s self._dbapi_connection = engine.raw_connection() 2409s except dialect.loaded_dbapi.Error as err: 2409s > Connection._handle_dbapi_exception_noconnection( 2409s err, dialect, engine 2409s ) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2409s dialect = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2409s 2409s @classmethod 2409s def _handle_dbapi_exception_noconnection( 2409s cls, 2409s e: BaseException, 2409s dialect: Dialect, 2409s engine: Optional[Engine] = None, 2409s is_disconnect: Optional[bool] = None, 2409s invalidate_pool_on_disconnect: bool = True, 2409s is_pre_ping: bool = False, 2409s ) -> NoReturn: 2409s exc_info = sys.exc_info() 2409s 2409s if is_disconnect is None: 2409s is_disconnect = isinstance( 2409s e, dialect.loaded_dbapi.Error 2409s ) and dialect.is_disconnect(e, None, None) 2409s 2409s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2409s 2409s if should_wrap: 2409s sqlalchemy_exception = exc.DBAPIError.instance( 2409s None, 2409s None, 2409s cast(Exception, e), 2409s dialect.loaded_dbapi.Error, 2409s hide_parameters=( 2409s engine.hide_parameters if engine is not None else False 2409s ), 2409s connection_invalidated=is_disconnect, 2409s dialect=dialect, 2409s ) 2409s else: 2409s sqlalchemy_exception = None 2409s 2409s newraise = None 2409s 2409s if dialect._has_events: 2409s ctx = ExceptionContextImpl( 2409s e, 2409s sqlalchemy_exception, 2409s engine, 2409s dialect, 2409s None, 2409s None, 2409s None, 2409s None, 2409s None, 2409s is_disconnect, 2409s invalidate_pool_on_disconnect, 2409s is_pre_ping, 2409s ) 2409s for fn in dialect.dispatch.handle_error: 2409s try: 2409s # handler returns an exception; 2409s # call next handler in a chain 2409s per_fn = fn(ctx) 2409s if per_fn is not None: 2409s ctx.chained_exception = newraise = per_fn 2409s except Exception as _raised: 2409s # handler raises an exception - stop processing 2409s newraise = _raised 2409s break 2409s 2409s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2409s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2409s ctx.is_disconnect 2409s ) 2409s 2409s if newraise: 2409s raise 
newraise.with_traceback(exc_info[2]) from e 2409s elif should_wrap: 2409s assert sqlalchemy_exception is not None 2409s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E 2409s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s _ ERROR at teardown of test_datetime_with_timezone_table[postgresql_psycopg2_conn] _ 2409s self = 2409s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s connection = None, _has_events = None, _allow_revalidate = True 2409s _allow_autobegin = True 2409s 2409s def __init__( 2409s self, 2409s engine: Engine, 2409s connection: Optional[PoolProxiedConnection] = None, 2409s _has_events: Optional[bool] = None, 2409s _allow_revalidate: bool = True, 2409s _allow_autobegin: bool = True, 2409s ): 2409s """Construct a new Connection.""" 2409s self.engine = engine 2409s self.dialect = dialect = engine.dialect 2409s 2409s if connection is None: 2409s try: 2409s > self._dbapi_connection = engine.raw_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def raw_connection(self) -> PoolProxiedConnection: 2409s """Return a "raw" DBAPI connection from the connection pool. 2409s 2409s The returned object is a proxied version of the DBAPI 2409s connection object used by the underlying driver in use. 2409s The object will have all the same behavior as the real DBAPI 2409s connection, except that its ``close()`` method will result in the 2409s connection being returned to the pool, rather than being closed 2409s for real. 2409s 2409s This method provides direct DBAPI connection access for 2409s special situations when the API provided by 2409s :class:`_engine.Connection` 2409s is not needed. When a :class:`_engine.Connection` object is already 2409s present, the DBAPI connection is available using 2409s the :attr:`_engine.Connection.connection` accessor. 2409s 2409s .. seealso:: 2409s 2409s :ref:`dbapi_connections` 2409s 2409s """ 2409s > return self.pool.connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def connect(self) -> PoolProxiedConnection: 2409s """Return a DBAPI connection from the pool. 2409s 2409s The connection is instrumented such that when its 2409s ``close()`` method is called, the connection will be returned to 2409s the pool. 
2409s 2409s """ 2409s > return _ConnectionFairy._checkout(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s threadconns = None, fairy = None 2409s 2409s @classmethod 2409s def _checkout( 2409s cls, 2409s pool: Pool, 2409s threadconns: Optional[threading.local] = None, 2409s fairy: Optional[_ConnectionFairy] = None, 2409s ) -> _ConnectionFairy: 2409s if not fairy: 2409s > fairy = _ConnectionRecord.checkout(pool) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s cls = 2409s pool = 2409s 2409s @classmethod 2409s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2409s if TYPE_CHECKING: 2409s rec = cast(_ConnectionRecord, pool._do_get()) 2409s else: 2409s > rec = pool._do_get() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _do_get(self) -> ConnectionPoolEntry: 2409s > return self._create_connection() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def _create_connection(self) -> ConnectionPoolEntry: 2409s """Called by subclasses to create a new ConnectionRecord.""" 2409s 2409s > return _ConnectionRecord(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s pool = , connect = True 2409s 2409s def __init__(self, pool: Pool, connect: bool = True): 2409s self.fresh = False 2409s self.fairy_ref = None 2409s self.starttime = 0 2409s self.dbapi_connection = None 2409s 2409s self.__pool = pool 2409s if connect: 2409s > self.__connect() 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s self.dbapi_connection = connection = pool._invoke_creator(self) 2409s pool.logger.debug("Created new connection %r", connection) 2409s self.fresh = True 2409s except BaseException as e: 2409s > with util.safe_reraise(): 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s type_ = None, value = None, traceback = None 2409s 2409s def __exit__( 2409s self, 2409s type_: Optional[Type[BaseException]], 2409s value: Optional[BaseException], 2409s traceback: Optional[types.TracebackType], 2409s ) -> NoReturn: 2409s assert self._exc_info is not None 2409s # see #2703 for notes 2409s if type_ is None: 2409s exc_type, exc_value, exc_tb = self._exc_info 2409s assert exc_value is not None 2409s self._exc_info = None # remove potential circular references 2409s > raise exc_value.with_traceback(exc_tb) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s 2409s def __connect(self) -> None: 2409s pool = self.__pool 2409s 2409s # ensure any existing connection is removed, so that if 2409s # creator fails, this attribute stays None 2409s self.dbapi_connection = None 2409s try: 2409s self.starttime = time.time() 2409s > self.dbapi_connection = connection = pool._invoke_creator(self) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s connection_record = 2409s 2409s def connect( 2409s connection_record: Optional[ConnectionPoolEntry] = None, 2409s ) -> DBAPIConnection: 2409s if dialect._has_events: 2409s for fn in dialect.dispatch.do_connect: 2409s connection = cast( 2409s DBAPIConnection, 2409s fn(dialect, connection_record, cargs, cparams), 2409s ) 2409s if connection is not None: 2409s return connection 2409s 2409s > return dialect.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s self = 2409s cargs = () 2409s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s 2409s def connect(self, *cargs, **cparams): 2409s # inherits the docstring from interfaces.Dialect.connect 2409s > return self.loaded_dbapi.connect(*cargs, **cparams) 2409s 2409s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2409s connection_factory = None, cursor_factory = None 2409s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2409s kwasync = {} 2409s 2409s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2409s """ 2409s Create a new database connection. 2409s 2409s The connection parameters can be specified as a string: 2409s 2409s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2409s 2409s or using a set of keyword arguments: 2409s 2409s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2409s 2409s Or as a mix of both. The basic connection parameters are: 2409s 2409s - *dbname*: the database name 2409s - *database*: the database name (only as keyword argument) 2409s - *user*: user name used to authenticate 2409s - *password*: password used to authenticate 2409s - *host*: database host address (defaults to UNIX socket if not provided) 2409s - *port*: connection port number (defaults to 5432 if not provided) 2409s 2409s Using the *connection_factory* parameter a different class or connections 2409s factory can be specified. It should be a callable object taking a dsn 2409s argument. 2409s 2409s Using the *cursor_factory* parameter, a new default cursor factory will be 2409s used by cursor(). 2409s 2409s Using *async*=True an asynchronous connection will be created. *async_* is 2409s a valid alias (for Python versions where ``async`` is a keyword). 2409s 2409s Any other keyword parameter will be passed to the underlying client 2409s library: the list of supported parameters depends on the library version. 
2409s 2409s """ 2409s kwasync = {} 2409s if 'async' in kwargs: 2409s kwasync['async'] = kwargs.pop('async') 2409s if 'async_' in kwargs: 2409s kwasync['async_'] = kwargs.pop('async_') 2409s 2409s dsn = _ext.make_dsn(dsn, **kwargs) 2409s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2409s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2409s E Is the server running on that host and accepting TCP/IP connections? 2409s 2409s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2409s 2409s The above exception was the direct cause of the following exception: 2409s 2409s @pytest.fixture 2409s def postgresql_psycopg2_engine(): 2409s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2409s td.versioned_importorskip("psycopg2") 2409s engine = sqlalchemy.create_engine( 2409s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2409s poolclass=sqlalchemy.pool.NullPool, 2409s ) 2409s yield engine 2409s > for view in get_all_views(engine): 2409s 2409s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2409s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2409s 2409s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2409s 2409s def get_all_views(conn): 2409s if isinstance(conn, sqlite3.Connection): 2409s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2409s return [view[0] for view in c.fetchall()] 2409s else: 2409s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2409s if adbc and isinstance(conn, adbc.Connection): 2409s results = [] 2409s info = conn.adbc_get_objects().read_all().to_pylist() 2409s for catalog in info: 2409s catalog["catalog_name"] 2409s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_datetime_with_timezone_roundtrip[postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_datetime_with_timezone_roundtrip[postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_out_of_bounds_datetime[postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __ ERROR at teardown of test_out_of_bounds_datetime[postgresql_psycopg2_conn] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_naive_datetimeindex_roundtrip[postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
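[Note] The connection attempts originate from pandas' postgresql_psycopg2_engine fixture, quoted further down in this traceback. Because the engine is built with poolclass=NullPool, no connections are cached: every checkout goes through _ConnectionRecord.__connect and dials the server afresh, which is why the identical failure repeats for each test. A rough equivalent of the fixture's engine construction, assuming only what the quoted fixture body shows:

    import sqlalchemy

    # Mirrors the fixture: NullPool means each use opens a new DBAPI connection.
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )

    # create_engine() itself is lazy; the OperationalError only surfaces on
    # first use (engine.connect()), which is exactly where the teardown fails.
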
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
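[Note] The same failure is reported twice per teardown: first as the raw psycopg2.OperationalError from the driver, then re-raised by Connection._handle_dbapi_exception_noconnection as sqlalchemy.exc.OperationalError (SQLAlchemy error code e3q8) with the driver exception attached. A short sketch of how calling code can distinguish the wrapper from the original, assuming the same engine as in the fixture:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )

    try:
        with engine.connect():
            pass
    except OperationalError as exc:
        # exc is the SQLAlchemy wrapper; exc.orig is the underlying
        # psycopg2.OperationalError quoted earlier in this traceback.
        print(type(exc.orig).__name__, exc.orig)
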
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_naive_datetimeindex_roundtrip[postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
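[Note] The error is attributed to teardown because the fixture's cleanup calls pandas' get_all_views(engine), which for a SQLAlchemy engine runs sqlalchemy.inspect(engine).get_view_names(); Inspector._init_engine opens and immediately closes a connection just to construct the inspector, so the finalizer itself needs a live server regardless of what the test body did. That step in isolation, using only the calls quoted in the traceback:

    import sqlalchemy
    from sqlalchemy import inspect

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )

    # inspect(engine) builds an Inspector; _init_engine verifies the engine by
    # doing engine.connect().close(), the call that raises OperationalError
    # here because nothing is listening on port 5432.
    views = inspect(engine).get_view_names()
    print(views)
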
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
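[Note] Since every entry in this run reduces to "no PostgreSQL reachable on localhost:5432", a test environment can either start such a server before the pandas SQL tests or skip them up front instead of collecting hundreds of identical teardown errors. A hedged sketch of a pre-flight check (the probe_postgres helper name is illustrative, not part of pandas or autopkgtest):

    import socket

    import pytest

    def probe_postgres(host: str = "localhost", port: int = 5432) -> bool:
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            return False

    # Applied to a test module, this turns the OperationalError storms above
    # into ordinary skips whenever the server is absent.
    requires_postgres = pytest.mark.skipif(
        not probe_postgres(), reason="PostgreSQL not reachable on localhost:5432"
    )
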
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___ ERROR at teardown of test_date_parsing[postgresql_psycopg2_engine_types] ___ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ ERROR at teardown of test_date_parsing[postgresql_psycopg2_conn_types] ____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ________ ERROR at teardown of test_datetime[postgresql_psycopg2_engine] ________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _________ ERROR at teardown of test_datetime[postgresql_psycopg2_conn] _________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_datetime_NaT[postgresql_psycopg2_engine] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
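For context, the raw_connection() frame above returns a pool-proxied DBAPI connection whose close() hands it back to the pool rather than closing the socket directly (with NullPool the pool then discards it). A short sketch of that usage, assuming a reachable server; otherwise it raises the same OperationalError shown here:

import sqlalchemy

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)
raw = engine.raw_connection()  # proxied psycopg2 connection from the pool
try:
    cur = raw.cursor()
    cur.execute("SELECT version()")
    print(cur.fetchone())
    cur.close()
finally:
    raw.close()      # returns the proxy to the pool
    engine.dispose()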
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______ ERROR at teardown of test_datetime_NaT[postgresql_psycopg2_conn] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____ ERROR at teardown of test_datetime_date[postgresql_psycopg2_engine] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_datetime_date[postgresql_psycopg2_conn] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
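The psycopg2 docstring quoted in this frame lists the basic connection parameters. For reference, a minimal sketch of the call these fixtures effectively make, with the parameter values copied from the dsn shown in the traceback (it assumes a local PostgreSQL server, which is exactly what this testbed lacks):

import psycopg2

try:
    # Same values as the traceback's dsn:
    # 'host=localhost dbname=pandas user=postgres password=postgres port=5432'
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
    conn.close()
except psycopg2.OperationalError as exc:
    # With nothing listening on localhost:5432 this branch is taken,
    # producing the "Connection refused" error recorded in this log.
    print(f"connection failed: {exc}")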
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____ ERROR at teardown of test_datetime_time[postgresql_psycopg2_engine] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
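The pandas fixture shown above, postgresql_psycopg2_engine, is where these teardown errors originate: its cleanup iterates get_all_views(engine), which calls sqlalchemy.inspect and therefore forces a real connection. A minimal sketch of that setup, using the same URL that appears in the log and assuming the same (here unavailable) local server:

import sqlalchemy
from sqlalchemy import inspect
from sqlalchemy.exc import OperationalError

# URL and pool class copied from the fixture in pandas/tests/io/test_sql.py.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)

try:
    # inspect(engine) opens a connection, so this raises the same wrapped
    # OperationalError seen above when PostgreSQL is not running.
    print(inspect(engine).get_view_names())
except OperationalError as exc:
    print(f"database unavailable: {exc}")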
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_datetime_time[postgresql_psycopg2_conn] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
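The frame above shows psycopg2.connect() being handed the DSN 'host=localhost dbname=pandas user=postgres password=postgres port=5432'. As a minimal sketch (not part of the test log, assuming psycopg2 is installed and nothing is listening on localhost:5432), the same call made directly fails with the "Connection refused" error reported below:

import psycopg2

# Connection parameters copied from the DSN shown in the traceback frame above.
try:
    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )
    conn.close()
except psycopg2.OperationalError as exc:
    # With no PostgreSQL server on port 5432 this prints the same
    # "Connection refused ... Is the server running" message as the log.
    print(exc)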
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___ ERROR at teardown of test_mixed_dtype_insert[postgresql_psycopg2_engine] ___ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
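The fixture traceback above shows why the teardown of postgresql_psycopg2_engine needs a live server even though engine creation itself succeeded: get_all_views() calls sqlalchemy.inspect(), which has to open a real DBAPI connection. A minimal sketch of that path (not part of the test log, assuming sqlalchemy and psycopg2 are installed and nothing is listening on localhost:5432):

import sqlalchemy
from sqlalchemy import inspect

# Engine creation is lazy and succeeds without a server; the URL and NullPool
# setting match the pandas fixture shown in the traceback above.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=sqlalchemy.pool.NullPool,
)
try:
    # inspect() must connect, so this raises sqlalchemy.exc.OperationalError
    # wrapping the psycopg2 "Connection refused" error seen in the log.
    print(inspect(engine).get_view_names())
except sqlalchemy.exc.OperationalError as exc:
    print(exc)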
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ ERROR at teardown of test_mixed_dtype_insert[postgresql_psycopg2_conn] ____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_nan_numeric[postgresql_psycopg2_engine] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
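Editor's note: the frame above is where SQLAlchemy wraps the driver error; _handle_dbapi_exception_noconnection re-raises the psycopg2.OperationalError as sqlalchemy.exc.OperationalError, which is why both exception types appear in this log. An illustrative sketch (same DSN as in the tracebacks, not part of the test suite) of how the wrapped and original errors relate:

from sqlalchemy import create_engine, exc

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    engine.connect()
except exc.OperationalError as err:
    # err is the SQLAlchemy-level wrapper seen in these tracebacks;
    # err.orig is the underlying psycopg2.OperationalError ("Connection refused").
    print(type(err.orig).__name__, err.orig)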
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______ ERROR at teardown of test_nan_numeric[postgresql_psycopg2_conn] ________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____ ERROR at teardown of test_nan_fullcolumn[postgresql_psycopg2_engine] _____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
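The psycopg2.connect() docstring quoted above describes the same keyword parameters that SQLAlchemy passes in cparams. A minimal sketch of how those keywords are folded into the DSN string from the traceback and how the same OperationalError surfaces when no server is listening (assumptions: psycopg2 is installed and the connection values are the ones shown above; this is illustrative and not part of the pandas tests):

import psycopg2
from psycopg2 import extensions as _ext

# Same parameters as cparams in the traceback above.
dsn = _ext.make_dsn(host="localhost", dbname="pandas", user="postgres",
                    password="postgres", port=5432)
print(dsn)  # roughly: host=localhost dbname=pandas user=postgres password=postgres port=5432

try:
    conn = psycopg2.connect(dsn)
    conn.close()
except psycopg2.OperationalError as exc:
    # With no server on localhost:5432 this prints the same
    # "Connection refused" message seen throughout this log.
    print("connection failed:", exc)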
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_nan_fullcolumn[postgresql_psycopg2_conn] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______ ERROR at teardown of test_nan_string[postgresql_psycopg2_engine] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
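What drags the teardown into a database round trip is the fixture's view cleanup: pandas' get_all_views helper calls SQLAlchemy's runtime inspection API, and constructing an Inspector opens (and immediately closes) a real connection. A sketch of that lookup, assuming a reachable server with the same DSN as the fixture above; the calls shown are the ones visible in the traceback, nothing new is introduced:

    from sqlalchemy import create_engine, inspect

    # Assumes the PostgreSQL instance the pandas SQL tests expect is actually running.
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    # Building an Inspector connects once; that connection attempt is what fails in this log.
    inspector = inspect(engine)
    print(inspector.get_view_names())  # the same call get_all_views() ends up making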
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ________ ERROR at teardown of test_nan_string[postgresql_psycopg2_conn] ________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
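The psycopg2.connect docstring reproduced in the frame above describes the parameters the pandas suite passes here. As a minimal sketch only, assuming psycopg2 is importable outside the test run and reusing the DSN shown verbatim in the 'dsn = ...' local above, the same connection attempt looks like:

import psycopg2

# DSN copied from the traceback locals above; nothing listens on localhost:5432 in this run.
DSN = "host=localhost dbname=pandas user=postgres password=postgres port=5432"

try:
    conn = psycopg2.connect(DSN)
except psycopg2.OperationalError as exc:
    # This is the failure mode recorded in this log: connection refused on port 5432.
    print(f"connection failed: {exc}")
else:
    conn.close()

As the docstring notes, the keyword form psycopg2.connect(dbname="pandas", user="postgres", password="postgres", host="localhost", port=5432) is equivalent to the DSN string form.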
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___ ERROR at teardown of test_to_sql_save_index[postgresql_psycopg2_engine] ____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
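The Engine.connect docstring quoted a few frames above documents the context-manager pattern in which the DBAPI connection is returned to the pool when the block exits. A minimal sketch of that usage with the same fixture URL, assuming the server were running; the table name is illustrative only:

from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
with engine.connect() as connection:
    # "some_table" is a hypothetical name, not a table from this test run.
    connection.execute(text("INSERT INTO some_table VALUES ('foo')"))
    connection.commit()  # without commit, the autobegun transaction is rolled back on exit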
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ ERROR at teardown of test_to_sql_save_index[postgresql_psycopg2_conn] _____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_transactions[postgresql_psycopg2_engine] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______ ERROR at teardown of test_transactions[postgresql_psycopg2_conn] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __ ERROR at teardown of test_transaction_rollback[postgresql_psycopg2_engine] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___ ERROR at teardown of test_transaction_rollback[postgresql_psycopg2_conn] ___ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_get_schema_create_table[postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_get_schema_create_table[postgresql_psycopg2_conn] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
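The psycopg2.connect docstring reproduced above is the layer where these teardown failures surface: the pandas fixture hands SQLAlchemy a postgresql+psycopg2 URL, the pool asks the dialect for a raw DBAPI connection, and psycopg2 then tries localhost:5432, which nothing in this testbed is listening on. A minimal standalone sketch of that same connection path (same URL and NullPool choice as the fixture shown later in this log; not part of the test suite) that can be used to check whether a PostgreSQL server is actually reachable:

    # Reachability check mirroring the connection parameters seen in this log
    # (postgres:postgres@localhost:5432/pandas); adjust if your setup differs.
    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError
    from sqlalchemy.pool import NullPool

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=NullPool,  # same pooling choice as the pandas fixture
    )

    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
        print("PostgreSQL reachable; the SQL tests should be able to run")
    except OperationalError as exc:
        # The same "Connection refused" failure seen throughout this log.
        print(f"PostgreSQL not reachable: {exc}")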
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _________ ERROR at teardown of test_dtype[postgresql_psycopg2_engine] __________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________ ERROR at teardown of test_dtype[postgresql_psycopg2_conn] ___________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_notna_dtype[postgresql_psycopg2_engine] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
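[editor's note] The psycopg2.connect docstring quoted above lists the keyword parameters the driver accepts; the frames show them resolved to 'host=localhost dbname=pandas user=postgres password=postgres port=5432'. A driver-level sketch, not part of the test run, using those same values to show the unwrapped psycopg2.OperationalError that SQLAlchemy later wraps.

import psycopg2

try:
    # Parameters taken from the DSN shown in the traceback frames above.
    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )
except psycopg2.OperationalError as exc:
    # "Connection refused" when nothing listens on localhost:5432.
    print("psycopg2 connect failed:", exc)
else:
    conn.close()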
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______ ERROR at teardown of test_notna_dtype[postgresql_psycopg2_conn] ________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ ERROR at teardown of test_double_precision[postgresql_psycopg2_engine] ____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____ ERROR at teardown of test_double_precision[postgresql_psycopg2_conn] _____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_connectable_issue_example[postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_connectable_issue_example[postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
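Note: the frames above show SQLAlchemy wrapping the driver-level psycopg2.OperationalError inside sqlalchemy.exc.OperationalError via Connection._handle_dbapi_exception_noconnection, which is why the log prints both the raw psycopg2 message and the wrapped form carrying the https://sqlalche.me/e/20/e3q8 link. A minimal sketch, assuming the same fixture URL, of the engine.connect() context-manager pattern quoted in the Engine.connect docstring above, with the wrapped error caught instead of propagating; all names here are illustrative:

    # Sketch only; URL and pool class assumed from the fixture in the log.
    import sqlalchemy
    from sqlalchemy import text
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )

    try:
        with engine.connect() as connection:
            connection.execute(text("SELECT 1"))
            connection.commit()
    except OperationalError as exc:
        # The wrapped form seen in this log (e3q8).
        print("database unreachable:", exc)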
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ ERROR at teardown of test_temporary_table[postgresql_psycopg2_engine] _____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____ ERROR at teardown of test_temporary_table[postgresql_psycopg2_conn] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
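The _handle_dbapi_exception_noconnection code shown above wraps the driver-level psycopg2.OperationalError in sqlalchemy.exc.OperationalError and re-raises it from the original, which is why the report shows the psycopg2 message embedded in the SQLAlchemy one. A short sketch of what a caller sees, again assuming no server answers on localhost:5432:

    # Sketch of the exception relationship produced by the wrapping code above,
    # assuming (as in these tests) that no server answers on localhost:5432.
    import psycopg2
    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        engine.connect()
    except OperationalError as exc:
        # The SQLAlchemy exception carries the DBAPI error: .orig and
        # __cause__ both point at the underlying psycopg2.OperationalError.
        assert isinstance(exc.orig, psycopg2.OperationalError)
        assert exc.__cause__ is exc.orig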
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____ ERROR at teardown of test_invalid_engine[postgresql_psycopg2_engine] _____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
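At engine/base.py:2442 SQLAlchemy wraps the driver's psycopg2.OperationalError in its own sqlalchemy.exc.OperationalError and re-raises it with the original exception as the cause, which is why the log shows the same message twice. A small sketch of how the two layers relate, assuming the same unreachable localhost URL; the underlying DBAPI exception is exposed on the wrapper's .orig attribute:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        conn = engine.connect()
    except OperationalError as exc:
        # exc is sqlalchemy.exc.OperationalError; exc.orig is the underlying
        # psycopg2.OperationalError carrying the "Connection refused" text.
        print(type(exc.orig).__name__)
        print(exc.orig)
    else:
        conn.close()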
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_invalid_engine[postgresql_psycopg2_conn] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_to_sql_with_sql_engine[postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
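As _handle_dbapi_exception_noconnection above shows, the driver-level psycopg2.OperationalError is wrapped into sqlalchemy.exc.OperationalError and re-raised with "from e"; a hedged sketch (same assumptions as before: SQLAlchemy 2.x plus psycopg2, no server on localhost:5432) of how the two layers relate:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=sqlalchemy.pool.NullPool,
    )
    try:
        engine.connect()
    except OperationalError as err:
        # err.orig is the original psycopg2.OperationalError; the "from e" in
        # _handle_dbapi_exception_noconnection also chains it as err.__cause__.
        print(type(err.orig).__name__)
        print(err.__cause__ is err.orig)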
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __ ERROR at teardown of test_to_sql_with_sql_engine[postgresql_psycopg2_conn] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
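The fixture passes poolclass=NullPool, so every checkout walks the full creation chain shown in the traceback (_ConnectionFairy._checkout -> _ConnectionRecord.__connect -> dialect.connect) and no connection is ever reused; a tiny sketch of that behaviour against an in-memory SQLite database (a stand-in for the unreachable Postgres server):

    import sqlalchemy
    from sqlalchemy.pool import NullPool

    # With NullPool, close() really closes the DBAPI connection instead of
    # returning it to a pool, so each connect() creates a brand-new one.
    engine = sqlalchemy.create_engine("sqlite:///:memory:", poolclass=NullPool)
    with engine.connect() as conn:
        print(conn.scalar(sqlalchemy.text("select 1")))  # 1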
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___ ERROR at teardown of test_options_sqlalchemy[postgresql_psycopg2_engine] ___ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ ERROR at teardown of test_options_sqlalchemy[postgresql_psycopg2_conn] ____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ ERROR at teardown of test_options_auto[postgresql_psycopg2_engine] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
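The frames above are SQLAlchemy translating the driver error: Connection.__init__ catches the DBAPI exception raised by engine.raw_connection(), and _handle_dbapi_exception_noconnection wraps it in sqlalchemy.exc.OperationalError, re-raised "from" the psycopg2 exception so the original error stays attached. A small sketch, assuming SQLAlchemy 2.x and the same illustrative URL, of how calling code can reach the underlying driver error:

    import sqlalchemy
    from sqlalchemy import text
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as err:
        # err.orig is the original psycopg2.OperationalError seen in this log;
        # err.__cause__ points at it too, because of the "raise ... from e" above
        print(type(err.orig).__name__, err.orig)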
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______ ERROR at teardown of test_options_auto[postgresql_psycopg2_conn] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s [traceback identical to the previous teardown error: the same psycopg2.OperationalError ("Connection refused" on localhost:5432) propagates through the pandas postgresql_psycopg2_engine fixture, get_all_views, and SQLAlchemy's inspect / Engine.connect / pool-checkout chain, ending in the wrapped error shown below]
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_chunksize_empty_dtypes[postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __ ERROR at teardown of test_chunksize_empty_dtypes[postgresql_psycopg2_conn] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
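The frame above quotes psycopg2.connect(), which is what the SQLAlchemy dialect ultimately calls with the cparams shown in this log (host=localhost, port=5432, dbname=pandas, user/password=postgres). A minimal standalone sketch of that same call, assuming a reachable server with those credentials (the testbed in this run has none, hence the Connection refused errors):

import psycopg2

# Hypothetical reproduction of the connection the test suite attempts; host,
# port, database and credentials are taken from the dsn shown above.
conn = psycopg2.connect(
    host="localhost",
    port=5432,
    dbname="pandas",
    user="postgres",
    password="postgres",
)
cur = conn.cursor()
cur.execute("SELECT 1")
print(cur.fetchone())   # (1,) when the server is up
cur.close()
conn.close()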
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
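_handle_dbapi_exception_noconnection() above is why this log shows the same failure in two flavours: the driver-level psycopg2.OperationalError and the wrapping sqlalchemy.exc.OperationalError that carries the https://sqlalche.me/e/20/e3q8 hint. A small sketch of observing that wrapping, assuming the same unreachable server:

import psycopg2
import sqlalchemy

engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    engine.connect().close()
except sqlalchemy.exc.OperationalError as err:
    # err is the wrapped SQLAlchemy exception; err.orig is the original
    # psycopg2.OperationalError raised by the DBAPI connect call.
    assert isinstance(err.orig, psycopg2.OperationalError)
    print(err.orig)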
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ ERROR at teardown of test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_conn] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______________ ERROR at teardown of test_psycopg2_schema_support _______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______________ ERROR at teardown of test_self_join_date_columns _______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s td.versioned_importorskip("psycopg2") 2410s engine = sqlalchemy.create_engine( 2410s "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas", 2410s poolclass=sqlalchemy.pool.NullPool, 2410s ) 2410s yield engine 2410s > for view in get_all_views(engine): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:659: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def get_all_views(conn): 2410s if isinstance(conn, sqlite3.Connection): 2410s c = conn.execute("SELECT name FROM sqlite_master WHERE type='view'") 2410s return [view[0] for view in c.fetchall()] 2410s else: 2410s adbc = import_optional_dependency("adbc_driver_manager.dbapi", errors="ignore") 2410s if adbc and isinstance(conn, adbc.Connection): 2410s results = [] 2410s info = conn.adbc_get_objects().read_all().to_pylist() 2410s for catalog in info: 2410s catalog["catalog_name"] 2410s for schema in catalog["catalog_db_schemas"]: 2410s schema["db_schema_name"] 2410s for table in schema["db_schema_tables"]: 2410s if table["table_type"] == "view": 2410s view_name = table["table_name"] 2410s results.append(view_name) 2410s 2410s return results 2410s else: 2410s from sqlalchemy import inspect 2410s 2410s > return inspect(conn).get_view_names() 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:533: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s subject = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s raiseerr = True 2410s 2410s def inspect(subject: Any, raiseerr: bool = True) -> Any: 2410s """Produce an inspection object for the given target. 2410s 2410s The returned value in some cases may be the 2410s same object as the one given, such as if a 2410s :class:`_orm.Mapper` object is passed. In other 2410s cases, it will be an instance of the registered 2410s inspection type for the given object, such as 2410s if an :class:`_engine.Engine` is passed, an 2410s :class:`_reflection.Inspector` object is returned. 2410s 2410s :param subject: the subject to be inspected. 2410s :param raiseerr: When ``True``, if the given subject 2410s does not 2410s correspond to a known SQLAlchemy inspected type, 2410s :class:`sqlalchemy.exc.NoInspectionAvailable` 2410s is raised. If ``False``, ``None`` is returned. 
2410s 2410s """ 2410s type_ = type(subject) 2410s for cls in type_.__mro__: 2410s if cls in _registrars: 2410s reg = _registrars.get(cls, None) 2410s if reg is None: 2410s continue 2410s elif reg is True: 2410s return subject 2410s > ret = reg(subject) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/inspection.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @inspection._inspects(Engine) 2410s def _engine_insp(bind: Engine) -> Inspector: # type: ignore[misc] 2410s > return Inspector._construct(Inspector._init_engine, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:303: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s init = 2410s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @classmethod 2410s def _construct( 2410s cls, init: Callable[..., Any], bind: Union[Engine, Connection] 2410s ) -> Inspector: 2410s if hasattr(bind.dialect, "inspector"): 2410s cls = bind.dialect.inspector 2410s 2410s self = cls.__new__(cls) 2410s > init(self, bind) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:236: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def _init_engine(self, engine: Engine) -> None: 2410s self.bind = self.engine = engine 2410s > engine.connect().close() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/reflection.py:247: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s =================================== FAILURES =================================== 2410s _________________ test_dataframe_to_sql[mysql_pymysql_engine] __________________ 2410s conn = 'mysql_pymysql_engine' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql(conn, test_frame1, request): 2410s # GH 51086 if conn is sqlite_engine 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:982: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 
2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 
2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff05f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff05f0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406297e0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 
2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s __________________ test_dataframe_to_sql[mysql_pymysql_conn] ___________________ 2410s conn = 'mysql_pymysql_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql(conn, test_frame1, request): 2410s # GH 51086 if conn is sqlite_engine 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:982: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 
2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff1790>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff1790> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ______________ test_dataframe_to_sql[postgresql_psycopg2_engine] _______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql(conn, test_frame1, request): 2410s # GH 51086 if conn is sqlite_engine 2410s conn = request.getfixturevalue(conn) 2410s > test_frame1.to_sql(name="test", con=conn, if_exists="append", index=False) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:983: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ( index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -...01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253,) 2410s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'append', 'index': False, 'name': 'test'} 2410s 2410s @wraps(func) 2410s def wrapper(*args, **kwargs): 2410s if len(args) > num_allow_args: 2410s warnings.warn( 2410s msg.format(arguments=_format_argument_list(allow_args)), 2410s FutureWarning, 2410s stacklevel=find_stack_level(), 2410s ) 2410s > return func(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 
1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s name = 'test' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'append', index = False, index_label = None 2410s chunksize = None, dtype = None, method = None 2410s 2410s @final 2410s @deprecate_nonkeyword_arguments( 2410s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2410s ) 2410s def to_sql( 2410s self, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool_t = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2410s newly created, appended to, or overwritten. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s Name of SQL table. 2410s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. Legacy support is provided for sqlite3.Connection objects. The user 2410s is responsible for engine disposal and connection closure for the SQLAlchemy 2410s connectable. See `here \ 2410s `_. 2410s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2410s the transaction will not be committed. If passing a sqlite3.Connection, 2410s it will not be possible to roll back the record insertion. 2410s 2410s schema : str, optional 2410s Specify the schema (if database flavor supports this). If None, use 2410s default schema. 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s How to behave if the table already exists. 2410s 2410s * fail: Raise a ValueError. 2410s * replace: Drop the table before inserting new values. 2410s * append: Insert new values to the existing table. 2410s 2410s index : bool, default True 2410s Write DataFrame index as a column. Uses `index_label` as the column 2410s name in the table. Creates a table index for this column. 2410s index_label : str or sequence, default None 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s * None : Uses standard SQL ``INSERT`` clause (one per row). 2410s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2410s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. 
None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s The number of returned rows affected is the sum of the ``rowcount`` 2410s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2410s reflect the exact number of written rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Raises 2410s ------ 2410s ValueError 2410s When the table already exists and `if_exists` is 'fail' (the 2410s default). 2410s 2410s See Also 2410s -------- 2410s read_sql : Read a DataFrame from a table. 2410s 2410s Notes 2410s ----- 2410s Timezone aware datetime columns will be written as 2410s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2410s database. Otherwise, the datetimes will be stored as timezone unaware 2410s timestamps local to the original timezone. 2410s 2410s Not all datastores support ``method="multi"``. Oracle, for example, 2410s does not support multi-value insert. 2410s 2410s References 2410s ---------- 2410s .. [1] https://docs.sqlalchemy.org 2410s .. [2] https://www.python.org/dev/peps/pep-0249/ 2410s 2410s Examples 2410s -------- 2410s Create an in-memory SQLite database. 2410s 2410s >>> from sqlalchemy import create_engine 2410s >>> engine = create_engine('sqlite://', echo=False) 2410s 2410s Create a table from scratch with 3 rows. 2410s 2410s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2410s >>> df 2410s name 2410s 0 User 1 2410s 1 User 2 2410s 2 User 3 2410s 2410s >>> df.to_sql(name='users', con=engine) 2410s 3 2410s >>> from sqlalchemy import text 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2410s 2410s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2410s 2410s >>> with engine.begin() as connection: 2410s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2410s ... df1.to_sql(name='users', con=connection, if_exists='append') 2410s 2 2410s 2410s This is allowed to support operations that require that the same 2410s DBAPI connection is used for the entire operation. 2410s 2410s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2410s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2410s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2410s (1, 'User 7')] 2410s 2410s Overwrite the table with just ``df2``. 2410s 2410s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2410s ... index_label='id') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 6'), (1, 'User 7')] 2410s 2410s Use ``method`` to define a callable insertion method to do nothing 2410s if there's a primary key conflict on a table in a PostgreSQL database. 2410s 2410s >>> from sqlalchemy.dialects.postgresql import insert 2410s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2410s ... # "a" is the primary key in "conflict_table" 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2410s ... result = conn.execute(stmt) 2410s ... 
return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2410s 0 2410s 2410s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2410s on a primary key. 2410s 2410s >>> from sqlalchemy.dialects.mysql import insert 2410s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2410s ... # update columns "b" and "c" on primary key conflict 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = ( 2410s ... insert(table.table) 2410s ... .values(data) 2410s ... ) 2410s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2410s 2 2410s 2410s Specify the dtype (especially useful for integers with missing values). 2410s Notice that while pandas is forced to store the data as floating point, 2410s the database supports nullable integers. When fetching the data with 2410s Python, we get back integer scalars. 2410s 2410s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2410s >>> df 2410s A 2410s 0 1.0 2410s 1 NaN 2410s 2 2.0 2410s 2410s >>> from sqlalchemy.types import Integer 2410s >>> df.to_sql(name='integers', con=engine, index=False, 2410s ... dtype={"A": Integer()}) 2410s 3 2410s 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2410s [(1,), (None,), (2,)] 2410s """ # noqa: E501 2410s from pandas.io import sql 2410s 2410s > return sql.to_sql( 2410s self, 2410s name, 2410s con, 2410s schema=schema, 2410s if_exists=if_exists, 2410s index=index, 2410s index_label=index_label, 2410s chunksize=chunksize, 2410s dtype=dtype, 2410s method=method, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s frame = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s name = 'test' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'append', index = False, index_label = None 2410s chunksize = None, dtype = None, method = None, engine = 'auto' 2410s engine_kwargs = {} 2410s 2410s def to_sql( 2410s frame, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s engine: str = "auto", 2410s **engine_kwargs, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Parameters 2410s ---------- 2410s frame : DataFrame, Series 2410s name : str 2410s Name of SQL table. 2410s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2410s or sqlite3 DBAPI2 connection 2410s ADBC provides high performance I/O with native type support, where available. 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. 2410s If a DBAPI2 object, only sqlite3 is supported. 
2410s schema : str, optional 2410s Name of SQL schema in database to write to (if database flavor 2410s supports this). If None, use default schema (default). 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s - fail: If table exists, do nothing. 2410s - replace: If table exists, drop it, recreate it, and insert data. 2410s - append: If table exists, insert data. Create if does not exist. 2410s index : bool, default True 2410s Write DataFrame index as a column. 2410s index_label : str or sequence, optional 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s - None : Uses standard SQL ``INSERT`` clause (one per row). 2410s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2410s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s engine : {'auto', 'sqlalchemy'}, default 'auto' 2410s SQL engine library to use. If 'auto', then the option 2410s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2410s behavior is 'sqlalchemy' 2410s 2410s .. versionadded:: 1.3.0 2410s 2410s **engine_kwargs 2410s Any additional kwargs are passed to the engine. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Notes 2410s ----- 2410s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2410s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2410s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2410s rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__ 2410s """ # noqa: E501 2410s if if_exists not in ("fail", "replace", "append"): 2410s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2410s 2410s if isinstance(frame, Series): 2410s frame = frame.to_frame() 2410s elif not isinstance(frame, DataFrame): 2410s raise NotImplementedError( 2410s "'frame' argument should be either a Series or a DataFrame" 2410s ) 2410s 2410s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 
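[editor note] pandasSQL_builder, quoted above, routes a plain sqlite3.Connection to SQLiteDatabase, so the sqlite-parametrized variants of these tests need neither SQLAlchemy's engine path nor an external server. A minimal sketch of that fallback, assuming only the standard library and pandas as installed on the testbed:

    import sqlite3
    import pandas as pd

    # In-memory sqlite3 DBAPI connection: pandasSQL_builder dispatches this to
    # SQLiteDatabase, so no SQLAlchemy engine and no database server are involved.
    conn = sqlite3.connect(":memory:")
    df = pd.DataFrame({"A": [1.0, None, 2.0]})
    rows = df.to_sql(name="test", con=conn, if_exists="append", index=False)
    print("rows written:", rows)
    print(conn.execute("SELECT * FROM test").fetchall())
    conn.close()
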
2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = 
_raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______________ test_dataframe_to_sql[postgresql_psycopg2_conn] ________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql(conn, test_frame1, request): 2410s # GH 51086 if conn is sqlite_engine 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:982: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______________ test_dataframe_to_sql_empty[mysql_pymysql_engine] _______________ 2410s conn = 'mysql_pymysql_engine' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql_empty(conn, test_frame1, request): 2410s if conn == "postgresql_adbc_conn": 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="postgres ADBC driver cannot insert index with null type", 2410s strict=True, 2410s ) 2410s ) 2410s # GH 51086 if conn is sqlite_engine 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:996: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 
2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 
2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff2090>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff2090> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 
2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _______________ test_dataframe_to_sql_empty[mysql_pymysql_conn] ________________ 2410s conn = 'mysql_pymysql_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql_empty(conn, test_frame1, request): 2410s if conn == "postgresql_adbc_conn": 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="postgres ADBC driver cannot insert index with null type", 2410s strict=True, 2410s ) 2410s ) 2410s # GH 51086 if conn is sqlite_engine 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:996: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 
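The mysql_pymysql failures above have a different root cause: they never reach the network. Importing pymysql pulls in pymysql._auth, which imports cryptography's hashes module, and on this testbed the compiled cryptography Rust bindings do not expose the 'hashes' attribute that the pure-Python layer expects, so the import itself dies with AttributeError. A minimal sketch (assumed) isolating that import; pymysql._auth appears to guard it only against ImportError, so the AttributeError propagates into every pymysql-backed fixture:

    try:
        from cryptography.hazmat.primitives import hashes  # noqa: F401
    except (ImportError, AttributeError) as exc:
        # On this image the Rust bindings and the Python layer of cryptography
        # are out of sync, so the import fails with AttributeError rather than
        # the ImportError a missing package would produce.
        print(f"cryptography unusable: {exc}")
    else:
        print("cryptography hashes available:", hashes.SHA256().name)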
2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
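The versioned_importorskip helper above follows the same skip-on-missing-dependency pattern as pytest's built-in importorskip. A minimal, illustrative sketch of that pattern (the module name and minimum version below are placeholders, not values taken from this log):

    import pytest

    # Skip the test at setup time if the optional dependency is missing
    # or older than the requested minimum version.
    pymysql = pytest.importorskip("pymysql", minversion="1.0.0")

    def test_uses_pymysql():
        # Reaching this point means pymysql imported successfully.
        assert pymysql.apilevel == "2.0"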
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff21b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eaff21b0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
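The install_as_MySQLdb helper quoted above simply aliases the pymysql module under the MySQLdb name in sys.modules. On a working install its effect can be sketched as:

    import pymysql

    pymysql.install_as_MySQLdb()   # registers pymysql as sys.modules["MySQLdb"]

    import MySQLdb                 # now resolves to the pymysql module object
    print(MySQLdb is pymysql)      # True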
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________ test_dataframe_to_sql_empty[postgresql_psycopg2_engine] ____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
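The AttributeError that aborts the pymysql import above means the cryptography package's compiled Rust bindings do not expose a hashes submodule, so the pure-Python hashes.py cannot bind Hash. A small check on the affected interpreter, using only names already present in the traceback, would be:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(cryptography.__version__)
    # True on a consistent install; False here, which is why the
    # `Hash = rust_openssl.hashes.Hash` line raises AttributeError.
    print(hasattr(rust_openssl, "hashes"))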
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql_empty(conn, test_frame1, request): 2410s if conn == "postgresql_adbc_conn": 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="postgres ADBC driver cannot insert index with null type", 2410s strict=True, 2410s ) 2410s ) 2410s # GH 51086 if conn is sqlite_engine 2410s conn = request.getfixturevalue(conn) 2410s empty_df = test_frame1.iloc[:0] 2410s > empty_df.to_sql(name="test", con=conn, if_exists="append", index=False) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:998: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = (Empty DataFrame 2410s Columns: [index, A, B, C, D] 2410s Index: [],) 2410s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'append', 'index': False, 'name': 'test'} 2410s 2410s @wraps(func) 2410s def wrapper(*args, **kwargs): 2410s if len(args) > num_allow_args: 2410s warnings.warn( 2410s msg.format(arguments=_format_argument_list(allow_args)), 2410s FutureWarning, 2410s stacklevel=find_stack_level(), 2410s ) 2410s > return func(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 
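test_dataframe_to_sql_empty drives DataFrame.to_sql against a PostgreSQL engine that is not reachable in this testbed (connection refused on localhost:5432). The same call pattern can be sketched against an in-memory SQLite engine instead; this is purely illustrative and does not reproduce the PostgreSQL failure:

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")   # in-memory stand-in for the refused Postgres server
    empty_df = pd.DataFrame(columns=["index", "A", "B", "C", "D"])

    # Appending an empty frame creates the table if needed and writes no rows;
    # to_sql reports the number of rows written.
    rows = empty_df.to_sql(name="test", con=engine, if_exists="append", index=False)
    print(rows)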
Empty DataFrame 2410s Columns: [index, A, B, C, D] 2410s Index: [], name = 'test' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'append', index = False, index_label = None 2410s chunksize = None, dtype = None, method = None 2410s 2410s @final 2410s @deprecate_nonkeyword_arguments( 2410s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2410s ) 2410s def to_sql( 2410s self, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool_t = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2410s newly created, appended to, or overwritten. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s Name of SQL table. 2410s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. Legacy support is provided for sqlite3.Connection objects. The user 2410s is responsible for engine disposal and connection closure for the SQLAlchemy 2410s connectable. See `here \ 2410s `_. 2410s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2410s the transaction will not be committed. If passing a sqlite3.Connection, 2410s it will not be possible to roll back the record insertion. 2410s 2410s schema : str, optional 2410s Specify the schema (if database flavor supports this). If None, use 2410s default schema. 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s How to behave if the table already exists. 2410s 2410s * fail: Raise a ValueError. 2410s * replace: Drop the table before inserting new values. 2410s * append: Insert new values to the existing table. 2410s 2410s index : bool, default True 2410s Write DataFrame index as a column. Uses `index_label` as the column 2410s name in the table. Creates a table index for this column. 2410s index_label : str or sequence, default None 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s * None : Uses standard SQL ``INSERT`` clause (one per row). 2410s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2410s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. 
None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s The number of returned rows affected is the sum of the ``rowcount`` 2410s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2410s reflect the exact number of written rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Raises 2410s ------ 2410s ValueError 2410s When the table already exists and `if_exists` is 'fail' (the 2410s default). 2410s 2410s See Also 2410s -------- 2410s read_sql : Read a DataFrame from a table. 2410s 2410s Notes 2410s ----- 2410s Timezone aware datetime columns will be written as 2410s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2410s database. Otherwise, the datetimes will be stored as timezone unaware 2410s timestamps local to the original timezone. 2410s 2410s Not all datastores support ``method="multi"``. Oracle, for example, 2410s does not support multi-value insert. 2410s 2410s References 2410s ---------- 2410s .. [1] https://docs.sqlalchemy.org 2410s .. [2] https://www.python.org/dev/peps/pep-0249/ 2410s 2410s Examples 2410s -------- 2410s Create an in-memory SQLite database. 2410s 2410s >>> from sqlalchemy import create_engine 2410s >>> engine = create_engine('sqlite://', echo=False) 2410s 2410s Create a table from scratch with 3 rows. 2410s 2410s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2410s >>> df 2410s name 2410s 0 User 1 2410s 1 User 2 2410s 2 User 3 2410s 2410s >>> df.to_sql(name='users', con=engine) 2410s 3 2410s >>> from sqlalchemy import text 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2410s 2410s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2410s 2410s >>> with engine.begin() as connection: 2410s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2410s ... df1.to_sql(name='users', con=connection, if_exists='append') 2410s 2 2410s 2410s This is allowed to support operations that require that the same 2410s DBAPI connection is used for the entire operation. 2410s 2410s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2410s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2410s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2410s (1, 'User 7')] 2410s 2410s Overwrite the table with just ``df2``. 2410s 2410s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2410s ... index_label='id') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 6'), (1, 'User 7')] 2410s 2410s Use ``method`` to define a callable insertion method to do nothing 2410s if there's a primary key conflict on a table in a PostgreSQL database. 2410s 2410s >>> from sqlalchemy.dialects.postgresql import insert 2410s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2410s ... # "a" is the primary key in "conflict_table" 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2410s ... result = conn.execute(stmt) 2410s ... 
return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2410s 0 2410s 2410s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2410s on a primary key. 2410s 2410s >>> from sqlalchemy.dialects.mysql import insert 2410s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2410s ... # update columns "b" and "c" on primary key conflict 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = ( 2410s ... insert(table.table) 2410s ... .values(data) 2410s ... ) 2410s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2410s 2 2410s 2410s Specify the dtype (especially useful for integers with missing values). 2410s Notice that while pandas is forced to store the data as floating point, 2410s the database supports nullable integers. When fetching the data with 2410s Python, we get back integer scalars. 2410s 2410s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2410s >>> df 2410s A 2410s 0 1.0 2410s 1 NaN 2410s 2 2.0 2410s 2410s >>> from sqlalchemy.types import Integer 2410s >>> df.to_sql(name='integers', con=engine, index=False, 2410s ... dtype={"A": Integer()}) 2410s 3 2410s 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2410s [(1,), (None,), (2,)] 2410s """ # noqa: E501 2410s from pandas.io import sql 2410s 2410s > return sql.to_sql( 2410s self, 2410s name, 2410s con, 2410s schema=schema, 2410s if_exists=if_exists, 2410s index=index, 2410s index_label=index_label, 2410s chunksize=chunksize, 2410s dtype=dtype, 2410s method=method, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s frame = Empty DataFrame 2410s Columns: [index, A, B, C, D] 2410s Index: [], name = 'test' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'append', index = False, index_label = None 2410s chunksize = None, dtype = None, method = None, engine = 'auto' 2410s engine_kwargs = {} 2410s 2410s def to_sql( 2410s frame, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s engine: str = "auto", 2410s **engine_kwargs, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Parameters 2410s ---------- 2410s frame : DataFrame, Series 2410s name : str 2410s Name of SQL table. 2410s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2410s or sqlite3 DBAPI2 connection 2410s ADBC provides high performance I/O with native type support, where available. 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. 2410s If a DBAPI2 object, only sqlite3 is supported. 2410s schema : str, optional 2410s Name of SQL schema in database to write to (if database flavor 2410s supports this). If None, use default schema (default). 
2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s - fail: If table exists, do nothing. 2410s - replace: If table exists, drop it, recreate it, and insert data. 2410s - append: If table exists, insert data. Create if does not exist. 2410s index : bool, default True 2410s Write DataFrame index as a column. 2410s index_label : str or sequence, optional 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s - None : Uses standard SQL ``INSERT`` clause (one per row). 2410s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2410s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s engine : {'auto', 'sqlalchemy'}, default 'auto' 2410s SQL engine library to use. If 'auto', then the option 2410s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2410s behavior is 'sqlalchemy' 2410s 2410s .. versionadded:: 1.3.0 2410s 2410s **engine_kwargs 2410s Any additional kwargs are passed to the engine. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Notes 2410s ----- 2410s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2410s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2410s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2410s rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__ 2410s """ # noqa: E501 2410s if if_exists not in ("fail", "replace", "append"): 2410s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2410s 2410s if isinstance(frame, Series): 2410s frame = frame.to_frame() 2410s elif not isinstance(frame, DataFrame): 2410s raise NotImplementedError( 2410s "'frame' argument should be either a Series or a DataFrame" 2410s ) 2410s 2410s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____________ test_dataframe_to_sql_empty[postgresql_psycopg2_conn] _____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_dataframe_to_sql_empty(conn, test_frame1, request): 2410s if conn == "postgresql_adbc_conn": 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="postgres ADBC driver cannot insert index with null type", 2410s strict=True, 2410s ) 2410s ) 2410s # GH 51086 if conn is sqlite_engine 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:996: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 
2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s 
None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____________________ test_to_sql[None-mysql_pymysql_engine] ____________________ 2410s conn = 'mysql_pymysql_engine', method = None 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1060: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 
2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 
2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576150>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576150> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 
2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _____________________ test_to_sql[None-mysql_pymysql_conn] _____________________ 2410s conn = 'mysql_pymysql_conn', method = None 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1060: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 
2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576270>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576270> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _________________ test_to_sql[None-postgresql_psycopg2_engine] _________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s method = None 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s conn = request.getfixturevalue(conn) 2410s > with pandasSQL_builder(conn, need_transaction=True) as pandasSQL: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1061: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________________ test_to_sql[None-postgresql_psycopg2_conn] __________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn', method = None 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1060: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
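The pandas test above parametrizes over fixture names ('postgresql_psycopg2_conn', 'mysql_pymysql_engine', ...) and resolves them at run time with request.getfixturevalue(), which is why the connection error is raised during fixture setup rather than inside the test body. A self-contained sketch of that pattern (the sqlite fixture name here is illustrative, not taken from pandas):

import sqlite3

import pytest

@pytest.fixture
def sqlite_conn():
    conn = sqlite3.connect(":memory:")
    yield conn
    conn.close()

# Parametrize over the *name* of a fixture and look it up dynamically.
@pytest.mark.parametrize("conn_name", ["sqlite_conn"])
def test_roundtrip(conn_name, request):
    conn = request.getfixturevalue(conn_name)
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("INSERT INTO t VALUES (1)")
    assert conn.execute("SELECT count(*) FROM t").fetchone() == (1,)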
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 
2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s 
None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
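As _handle_dbapi_exception_noconnection above shows, SQLAlchemy wraps the driver-level psycopg2.OperationalError in its own sqlalchemy.exc.OperationalError while keeping the original exception on .orig, which is why the log shows both the wrapped and the bare form of the same error. A minimal sketch of the wrapping, using a local SQLite path that cannot be opened so no external service is needed (SQLite here is only a stand-in for the unreachable PostgreSQL server):

from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

# A database file in a directory that does not exist triggers the same
# wrapping locally at connect time.
engine = create_engine("sqlite:////nonexistent-dir/does-not-exist.db")

try:
    engine.connect()
except OperationalError as exc:
    # exc is the SQLAlchemy wrapper; exc.orig is the raw DBAPI exception
    # (psycopg2.OperationalError in the failures above, sqlite3.OperationalError here).
    print(type(exc).__name__, "wrapping", type(exc.orig).__name__)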
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___________________ test_to_sql[multi-mysql_pymysql_engine] ____________________ 2410s conn = 'mysql_pymysql_engine', method = 'multi' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1060: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 
2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 
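mysql_pymysql_engine goes through versioned_importorskip / import_optional_dependency above: when the optional driver cannot be imported (or is too old) the test is skipped rather than failed, and the frames that follow simply walk through the import of pymysql itself, including its cryptography-based auth helpers. A rough, generic sketch of the skip-on-missing-optional-dependency pattern (not the pandas implementation itself):

import importlib

import pytest

def optional_import_or_skip(name: str):
    """Return the imported module, or skip the calling test if it is unavailable."""
    try:
        return importlib.import_module(name)
    except ImportError as exc:
        pytest.skip(f"optional dependency {name!r} is not importable: {exc}")

def test_needs_pymysql():
    pymysql = optional_import_or_skip("pymysql")
    assert hasattr(pymysql, "connect")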
2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576c90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576c90> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 
2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ____________________ test_to_sql[multi-mysql_pymysql_conn] _____________________ 2410s conn = 'mysql_pymysql_conn', method = 'multi' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1060: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 
2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576db0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea576db0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ________________ test_to_sql[multi-postgresql_psycopg2_engine] _________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s method = 'multi' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s conn = request.getfixturevalue(conn) 2410s > with pandasSQL_builder(conn, need_transaction=True) as pandasSQL: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1061: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _________________ test_to_sql[multi-postgresql_psycopg2_conn] __________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn', method = 'multi' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("method", [None, "multi"]) 2410s def test_to_sql(conn, method, test_frame1, request): 2410s if method == "multi" and "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'method' not implemented for ADBC drivers", strict=True 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1060: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
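The frames above come from pytest's fixture machinery rather than from pandas itself: the test receives the connection fixture's name as a parametrized string and resolves it at runtime with request.getfixturevalue(), which is why the lookup goes through _get_active_fixturedef() and FixtureDef.execute(). A small self-contained sketch of that pattern, with a hypothetical fixture name standing in for postgresql_psycopg2_conn:

# Sketch of the parametrize-a-fixture-name pattern used by
# pandas/tests/io/test_sql.py above; "fake_conn" is a made-up stand-in.
import pytest

@pytest.fixture
def fake_conn():
    yield "connection-object"

@pytest.mark.parametrize("conn", ["fake_conn"])
def test_example(conn, request):
    # The string parameter is turned into the actual fixture value here,
    # which is the point where the real tests fail when their database
    # fixture cannot build a connection.
    conn = request.getfixturevalue(conn)
    assert conn == "connection-object"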
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 
2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s 
None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
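_handle_dbapi_exception_noconnection(), shown above, wraps the driver exception in a SQLAlchemy DBAPIError subclass and re-raises it "from" the original, which is why the log reports both psycopg2.OperationalError and sqlalchemy.exc.OperationalError for the same refused connection. A short sketch, under the same assumptions as the previous one, of how the wrapped error keeps the driver exception reachable:

# Sketch only: the URL matches the engine shown in the log; .orig and
# .connection_invalidated are standard attributes of SQLAlchemy's DBAPIError.
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    engine.connect()
except OperationalError as exc:
    print(type(exc.orig))               # the underlying psycopg2.OperationalError
    print(exc.connection_invalidated)   # False here: refused, not a dropped connection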
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______________ test_to_sql_exist[replace-1-mysql_pymysql_engine] _______________ 2410s conn = 'mysql_pymysql_engine', mode = 'replace', num_row_coef = 1 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1070: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 
2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 
2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577770>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577770> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 
2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _______________ test_to_sql_exist[replace-1-mysql_pymysql_conn] ________________ 2410s conn = 'mysql_pymysql_conn', mode = 'replace', num_row_coef = 1 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1070: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 
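The AttributeError shown just above is the actual root cause of every mysql_pymysql_* failure in this run: on this testbed python3-cryptography's Rust bindings expose no `hashes` attribute, so importing cryptography.hazmat.primitives.hashes fails, which in turn makes `import pymysql` fail (pymysql._auth guards its cryptography import with try/except, but that guard appears to target ImportError, not AttributeError). Because the skip helper is built around import errors, the tests error out rather than being skipped. A minimal sketch of that chain, runnable outside pytest and assuming the same testbed image (on a healthy install both checks simply succeed):

# Sketch only; not part of the captured test output.
try:
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl
    _ = rust_openssl.hashes.Hash  # the exact attribute access that raised above
    print("cryptography bindings OK")
except AttributeError as exc:
    print("broken cryptography bindings:", exc)

try:
    import pymysql
    print("pymysql", pymysql.VERSION_STRING)
except Exception as exc:  # AttributeError on this testbed, so pymysql fails to import at all
    print("import pymysql failed:", type(exc).__name__, exc)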
2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577890>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577890> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________ test_to_sql_exist[replace-1-postgresql_psycopg2_engine] ____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s mode = 'replace', num_row_coef = 1 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s conn = request.getfixturevalue(conn) 2410s > with pandasSQL_builder(conn, need_transaction=True) as pandasSQL: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1071: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
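The psycopg2.OperationalError above is a plain connection refusal: nothing is listening on localhost:5432 inside the autopkgtest testbed, so every postgresql_psycopg2_* parametrization errors while its fixture tries to open a connection. A small preflight sketch, assuming psycopg2 is installed and using the same DSN parameters visible in the cparams above, separates "server unreachable" from "driver missing":

# Sketch only; host/port/credentials are the ones shown in this log.
import socket
import psycopg2

def postgres_reachable(host="localhost", port=5432, timeout=1.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if postgres_reachable():
    conn = psycopg2.connect(host="localhost", port=5432, dbname="pandas",
                            user="postgres", password="postgres")
    conn.close()
else:
    print("no PostgreSQL server on localhost:5432; DB-backed tests cannot run here")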
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
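Editor's sketch (not part of the captured log): the SQLDatabase.__init__ frame earlier in this traceback shows the resource-ownership pattern pandas uses, an ExitStack that disposes the Engine it created from a URI and closes the Connection it entered. A minimal, self-contained illustration of that same pattern, using an in-memory SQLite URL instead of the failing PostgreSQL DSN:

    from contextlib import ExitStack
    from sqlalchemy import create_engine, text

    # Sketch of the ExitStack ownership pattern seen in pandas.io.sql.SQLDatabase:
    # the stack disposes the Engine and closes the Connection when it unwinds.
    with ExitStack() as stack:
        engine = create_engine("sqlite:///:memory:")   # in-memory DB keeps the sketch standalone
        stack.callback(engine.dispose)                 # Engine disposed on exit
        conn = stack.enter_context(engine.connect())   # Connection closed on exit
        print(conn.execute(text("select 1")).scalar())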
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____________ test_to_sql_exist[replace-1-postgresql_psycopg2_conn] _____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
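Editor's note (not part of the captured log): the PostgreSQL-backed parametrizations all fail the same way; psycopg2 cannot reach port 5432 on either ::1 or 127.0.0.1, and SQLAlchemy wraps that DBAPI error as sqlalchemy.exc.OperationalError before any test code runs. A hedged, stdlib-only sketch of a pre-flight reachability check for the host/port taken from the traceback (the helper name is illustrative, not part of the pandas test suite):

    import socket

    # Host and port copied from the DSN shown in the traceback above.
    PG_HOST, PG_PORT = "localhost", 5432

    def postgres_reachable(host: str = PG_HOST, port: int = PG_PORT, timeout: float = 1.0) -> bool:
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if not postgres_reachable():
        print("no PostgreSQL server on localhost:5432; psycopg2 connections will be refused")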
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn', mode = 'replace', num_row_coef = 1 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1070: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
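Editor's note (not part of the captured log): the FixtureDef.execute source quoted above also explains why the identical OperationalError traceback recurs for every test in the parametrization; once the engine fixture fails, the exception is stored in cached_result and re-raised for each later request of the same fixture value. A rough toy model of that cache-and-replay behaviour (simplified; the class and names are illustrative, not pytest's API):

    import sys

    class CachedFixture:
        """Toy model of pytest's cached_result: run once, then replay value or error."""
        def __init__(self, func):
            self.func = func
            self.cached = None          # (value, exc_info) once executed

        def execute(self):
            if self.cached is not None:
                value, exc_info = self.cached
                if exc_info is not None:
                    # replay the original failure, as pytest does via cached_result[2]
                    raise exc_info[1].with_traceback(exc_info[2])
                return value
            try:
                value = self.func()
            except Exception:
                self.cached = (None, sys.exc_info())
                raise
            self.cached = (value, None)
            return value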
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______________ test_to_sql_exist[append-2-mysql_pymysql_engine] _______________ 2410s conn = 'mysql_pymysql_engine', mode = 'append', num_row_coef = 2 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1070: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 
2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 
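Editor's note (not part of the captured log): versioned_importorskip, whose docstring is quoted above, is a thin wrapper around pandas' import_optional_dependency and, judging by its name and the pytest.importorskip it replaces, presumably turns an ImportError into a test skip rather than a failure. A minimal standalone sketch of that try-import-or-skip idea (not the Debian-patched helper itself):

    import importlib
    import pytest

    def importorskip_sketch(name: str):
        """Import `name`, or skip the calling test if it is not installed."""
        try:
            return importlib.import_module(name)
        except ImportError:
            pytest.skip(f"missing optional dependency '{name}'")

    # usage inside a test or fixture:
    # pymysql = importorskip_sketch("pymysql")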
2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 
2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577dd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577dd0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 
2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ________________ test_to_sql_exist[append-2-mysql_pymysql_conn] ________________ 2410s conn = 'mysql_pymysql_conn', mode = 'append', num_row_coef = 2 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1070: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 
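The AttributeError just above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while pymysql merely imports cryptography, so every test reaching the mysql_pymysql_* fixtures fails the same way. This usually points at a python-cryptography installation whose pure-Python modules and compiled _rust extension are out of step; that is an inference from the traceback, not something the log itself states. A small probe, sketched under the assumption that it runs in the same interpreter and package set:

import importlib

def probe_cryptography_rust():
    # Import the compiled bindings directly and report whether the submodule
    # that hashes.py expects (rust_openssl.hashes) is actually present.
    try:
        rust = importlib.import_module("cryptography.hazmat.bindings._rust")
    except ImportError as exc:
        return f"_rust bindings missing entirely: {exc!r}"
    openssl = getattr(rust, "openssl", None)
    if openssl is None or not hasattr(openssl, "hashes"):
        return "compiled bindings present but lack openssl.hashes (likely version mismatch)"
    return "cryptography bindings look consistent"

print(probe_cryptography_rust())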
2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577ef0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea577ef0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
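For context on the pymysql source quoted in this traceback: install_as_MySQLdb() simply registers the pymysql module under the MySQLdb name so code written against mysqlclient keeps working. A minimal sketch, assuming an environment where pymysql imports cleanly (unlike the testbed in this log):

import pymysql

pymysql.install_as_MySQLdb()   # sys.modules["MySQLdb"] = sys.modules["pymysql"]

import MySQLdb                 # resolves to the already-registered pymysql module

print(MySQLdb is pymysql)      # True: both names refer to the same module object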
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ____________ test_to_sql_exist[append-2-postgresql_psycopg2_engine] ____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s mode = 'append', num_row_coef = 2 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s conn = request.getfixturevalue(conn) 2410s > with pandasSQL_builder(conn, need_transaction=True) as pandasSQL: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1071: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
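The psycopg2.OperationalError above ("connection to server at \"localhost\" ... port 5432 failed: Connection refused") means nothing is listening on the PostgreSQL port in this testbed, so the postgresql_psycopg2_* variants fail for an environmental reason rather than a pandas defect. A quick connectivity check, sketched under the assumption that the host and port from the DSN shown in the log are the ones to probe:

import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    # Plain TCP connect; ConnectionRefusedError / timeout both surface as OSError.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(port_open("localhost", 5432))   # False in an environment like this testbed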
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
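The frame at /usr/lib/python3/dist-packages/pandas/io/sql.py:906 near the top of this traceback shows how pandas picks a SQL backend from the `con` argument: a sqlite3.Connection (or None) is wrapped in SQLiteDatabase, while a URI string or SQLAlchemy Connectable is handed to SQLDatabase, which is what ends up calling create_engine() and Engine.connect(). A minimal sketch of both paths, illustrative only and not part of the test suite, using an in-memory SQLite database so it runs without any server:

    import sqlite3

    import pandas as pd
    from sqlalchemy import create_engine

    df = pd.DataFrame({"a": [1, 2, 3]})

    # DBAPI path: a sqlite3.Connection is wrapped in SQLiteDatabase.
    conn = sqlite3.connect(":memory:")
    df.to_sql("t", conn, index=False)
    print(pd.read_sql("SELECT * FROM t", conn))
    conn.close()

    # SQLAlchemy path: an Engine (or URI string) goes through SQLDatabase.
    engine = create_engine("sqlite:///:memory:")
    df.to_sql("t", engine, index=False)
    print(pd.read_sql("SELECT * FROM t", engine))
    engine.dispose()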
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____________ test_to_sql_exist[append-2-postgresql_psycopg2_conn] _____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
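The raw_connection() and Pool.connect() docstrings quoted just above, together with the engine.connect() usage shown earlier in the traceback, describe the context-manager pattern the pandas fixtures rely on. A self-contained illustration of that pattern against SQLite, so no database server is needed (the failing tests use a postgresql+psycopg2:// URL instead):

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite:///:memory:")

    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()            # commit the autobegun transaction
        print(connection.execute(text("SELECT x FROM t")).scalar_one())

    # Leaving the block returns the DBAPI connection to the pool; dispose()
    # closes the pool's connections entirely.
    engine.dispose()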
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
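The psycopg2.connect() docstring quoted above belongs to the call that fails in this run: no PostgreSQL server is listening on localhost:5432 on the testbed, so every postgresql_psycopg2_* fixture raises OperationalError with "Connection refused". A short probe using the same DSN parameters shown in the log (host=localhost dbname=pandas user=postgres password=postgres port=5432); the try/except here is only an illustration, not how pandas' conftest handles it:

    import psycopg2

    params = dict(host="localhost", dbname="pandas",
                  user="postgres", password="postgres", port=5432)

    try:
        conn = psycopg2.connect(**params)
    except psycopg2.OperationalError as exc:
        # The path taken in this log: connection refused, server not running.
        print(f"PostgreSQL not reachable: {exc}")
    else:
        with conn, conn.cursor() as cur:
            cur.execute("SELECT version()")
            print(cur.fetchone()[0])
        conn.close()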
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn', mode = 'append', num_row_coef = 2 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s @pytest.mark.parametrize("mode, num_row_coef", [("replace", 1), ("append", 2)]) 2410s def test_to_sql_exist(conn, mode, num_row_coef, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1070: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
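The getfixturevalue() docstring and _get_active_fixturedef() code above are what resolve the parametrized fixture names ('postgresql_psycopg2_conn', 'mysql_pymysql_engine', ...) that pandas' test_sql.py passes around as plain strings. A hypothetical, self-contained test module using the same pattern; the fixture and test names here are made up for illustration:

    import sqlite3

    import pytest

    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    # Like pandas' all_connectable list, this parametrizes over fixture names
    # and resolves them lazily, so an unavailable backend only fails (or is
    # skipped) at fixture-setup time rather than at collection time.
    @pytest.mark.parametrize("conn_name", ["sqlite_conn"])
    def test_roundtrip(conn_name, request):
        conn = request.getfixturevalue(conn_name)
        conn.execute("CREATE TABLE t (x INTEGER)")
        conn.execute("INSERT INTO t VALUES (42)")
        assert conn.execute("SELECT x FROM t").fetchone() == (42,)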
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _________________ test_to_sql_exist_fail[mysql_pymysql_engine] _________________ 2410s conn = 'mysql_pymysql_engine' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_to_sql_exist_fail(conn, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1080: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c0e30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c0e30> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s __________________ test_to_sql_exist_fail[mysql_pymysql_conn] __________________ 2410s conn = 'mysql_pymysql_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_to_sql_exist_fail(conn, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1080: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
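The AttributeError recorded above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while pymysql is still being imported, so every MySQL-backed fixture in this run aborts before any database connection is attempted. A minimal stand-alone check, assuming the same python3-cryptography build installed on this testbed (illustrative only, not part of the test suite):

    # Reproduces the import-time failure independently of pandas and pymysql.
    try:
        from cryptography.hazmat.primitives import hashes
        hashes.Hash(hashes.SHA256())  # would succeed with working Rust bindings
    except AttributeError as exc:
        print(f"cryptography Rust bindings are incomplete here: {exc}")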
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
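The comment above ("a previous but differently parametrized fixture instance, so we need to tear it down before creating a new one") summarises the caching rule these pytest frames keep re-entering. A small, self-contained illustration of that behaviour, unrelated to the pandas suite (fixture name and parameters are invented):

    import pytest

    # With module scope, pytest caches the fixture value per parameter: every test first runs
    # against "a", then the cached instance is finalized and the fixture is re-executed for "b".
    @pytest.fixture(scope="module", params=["a", "b"])
    def backend(request):
        print(f"setup {request.param}")
        yield request.param
        print(f"teardown {request.param}")

    def test_one(backend):
        assert backend in {"a", "b"}

    def test_two(backend):
        assert backend in {"a", "b"}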
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c0f50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c0f50> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
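For context, the install_as_MySQLdb() compatibility helper whose docstring ends above (its one-line body appears in the next chunk of the traceback) is meant to be used roughly as follows; a hypothetical sketch, never reached on this testbed because importing pymysql itself fails:

    import pymysql

    pymysql.install_as_MySQLdb()  # registers the pymysql module under the name "MySQLdb"
    import MySQLdb                # now resolves to pymysql via sys.modules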
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ______________ test_to_sql_exist_fail[postgresql_psycopg2_engine] ______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_to_sql_exist_fail(conn, test_frame1, request): 2410s conn = request.getfixturevalue(conn) 2410s > with pandasSQL_builder(conn, need_transaction=True) as pandasSQL: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1081: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
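The psycopg2.OperationalError above (connection refused on localhost:5432 over both ::1 and 127.0.0.1) is the second failure mode in this run: no PostgreSQL server is running on the testbed, so every postgresql_psycopg2 fixture fails at connect time. A dependency-free way to confirm that from the same environment (illustrative only):

    import socket

    # Roughly what libpq attempts before authentication: a plain TCP connect to port 5432.
    with socket.socket() as s:
        s.settimeout(2)
        try:
            s.connect(("localhost", 5432))
            print("something is listening on port 5432")
        except OSError as exc:
            print(f"PostgreSQL not reachable: {exc}")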
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______________ test_to_sql_exist_fail[postgresql_psycopg2_conn] _______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_to_sql_exist_fail(conn, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1080: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______________ test_read_iris_query[mysql_pymysql_engine_iris] ________________ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1092: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
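[editor's note] The pluggy frames above show how pytest dispatches pytest_fixture_setup through a HookCaller: keyword-only arguments, an optional firstresult spec, and _hookexec iterating the registered implementations. A minimal pluggy sketch of that pattern (MySpec/PluginA/my_hook are illustrative names, not from this log):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class MySpec:
        @hookspec(firstresult=True)   # stop at the first non-None result
        def my_hook(self, value):
            """Example hook specification."""

    class PluginA:
        @hookimpl
        def my_hook(self, value):
            return value * 2

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(MySpec)
    pm.register(PluginA())

    # Hooks are called with keyword arguments only, as in the traceback above.
    print(pm.hook.my_hook(value=21))   # -> 42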
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
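[editor's note] The pandas helper import_optional_dependency documented above ultimately delegates to importlib.import_module, whose docstring is also quoted. A hedged usage sketch of both; the pandas function is a private helper, so its location and behaviour may differ between pandas versions, and this assumes an environment where the imports succeed, unlike the testbed here:

    import importlib

    # Absolute import, the same call the frames above make for 'pymysql'.
    mod = importlib.import_module("json")

    # A relative import needs the anchor package, per the docstring above.
    decoder = importlib.import_module(".decoder", package="json")

    # pandas' private wrapper adds minimum-version checking for optional deps.
    from pandas.compat._optional import import_optional_dependency

    maybe = import_optional_dependency("pymysql", errors="ignore")
    print(maybe)   # the module if installed (even if old), otherwise None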
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c1490>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c1490> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
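[editor's note] pymysql/_auth.py, quoted above, wraps its cryptography imports in a try/except so that only the authentication methods needing RSA/SHA2 are disabled when cryptography is unavailable. That guard evidently does not catch what happens next in this log: cryptography itself raises an AttributeError while importing its hashes module, so the error propagates out of the pymysql import. A minimal sketch of the same guarded-optional-import pattern (names are illustrative, and it assumes a working cryptography installation):

    # Optional-dependency guard similar in spirit to pymysql/_auth.py.
    try:
        from cryptography.hazmat.primitives import hashes
        _HAVE_CRYPTOGRAPHY = True
    except ImportError:
        # cryptography genuinely absent; features needing it are disabled.
        _HAVE_CRYPTOGRAPHY = False

    def sha256_digest(data: bytes) -> bytes:
        if not _HAVE_CRYPTOGRAPHY:
            raise RuntimeError("cryptography is required for sha256_digest()")
        digest = hashes.Hash(hashes.SHA256())
        digest.update(data)
        return digest.finalize()

    if _HAVE_CRYPTOGRAPHY:
        print(sha256_digest(b"hello").hex())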
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ________________ test_read_iris_query[mysql_pymysql_conn_iris] _________________ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1092: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
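[editor's note] request.getfixturevalue, whose docstring is quoted just above, is how the parametrized pandas SQL tests turn a fixture *name* into a live connection object. A small self-contained illustration of the same pattern (fixture and test names are made up for the example):

    import pytest

    @pytest.fixture
    def sqlite_conn():
        import sqlite3
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    # Parametrize over fixture *names*, then resolve them at test time,
    # mirroring pandas' test_read_iris_query(conn, request).
    @pytest.mark.parametrize("conn_name", ["sqlite_conn"])
    def test_roundtrip(conn_name, request):
        conn = request.getfixturevalue(conn_name)
        assert conn.execute("SELECT 1").fetchone() == (1,)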
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c15b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c15b0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ____________ test_read_iris_query[postgresql_psycopg2_engine_iris] _____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
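[editor's note] Engine.raw_connection(), documented in the frames above, hands back a pooled DBAPI connection whose close() returns it to the pool rather than closing it. A hedged sketch of that API against an in-memory SQLite engine; the PostgreSQL URL from this log would need a running server, which the testbed clearly lacks:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")   # in-memory DB, no server needed

    # Ordinary Connection API.
    with engine.connect() as conn:
        assert conn.execute(text("SELECT 1")).scalar() == 1

    # "Raw" proxied DBAPI connection, as described in the docstring above.
    raw = engine.raw_connection()
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchone())    # (1,)
    finally:
        raw.close()              # returned to the pool, not closed for real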
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1092: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____________ test_read_iris_query[postgresql_psycopg2_conn_iris] ______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
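Both parametrized failures in this excerpt, the engine_iris one above and the conn_iris one that follows, reduce to the same root cause spelled out in the error text: nothing is listening on localhost port 5432 in this testbed, so the postgresql iris fixtures fail during setup before any pandas code runs. As a minimal sketch, assuming only the connection parameters visible in the cparams of the traceback (host, dbname, user, password, port) and nothing about the actual testbed configuration, the failing DBAPI call boils down to:

    # Reproduction sketch; parameter values are copied from the cparams shown in
    # the traceback above and are not taken from any pandas configuration file.
    import psycopg2

    params = dict(host="localhost", dbname="pandas", user="postgres",
                  password="postgres", port=5432)

    try:
        # Same DBAPI call the dialect makes via self.loaded_dbapi.connect(*cargs, **cparams)
        conn = psycopg2.connect(**params)
    except psycopg2.OperationalError as exc:
        # With no PostgreSQL server running, this branch is taken, matching the
        # "Connection refused" lines in the log.
        print("server unreachable:", exc)
    else:
        conn.close()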
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1092: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________ test_read_iris_query_chunksize[mysql_pymysql_engine_iris] ___________ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1111: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c1fd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c1fd0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________ test_read_iris_query_chunksize[mysql_pymysql_conn_iris] ____________ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1111: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
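# A minimal sketch (not part of the test suite): the AttributeError above
# ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'")
# surfaces while importing pymysql only because pymysql._auth pulls in
# cryptography.hazmat.primitives; the pure-Python layer of python3-cryptography
# appears to be out of step with its compiled _rust extension on this testbed.
# The failing import can be isolated without pandas or pymysql:
import cryptography
print("cryptography", cryptography.__version__)
try:
    # hashes.py executes "Hash = rust_openssl.hashes.Hash" at import time,
    # which is the exact line that raises in the traceback above.
    from cryptography.hazmat.primitives import hashes
    print("hashes OK:", hashes.SHA256().name)
except AttributeError as exc:
    print("broken _rust bindings:", exc)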
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
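The guarded == above exists because a parametrized fixture's cache key can be an object whose comparison does not collapse to a single bool; the numpy case referenced by #6497 in the quoted comment is easy to reproduce in a short sketch (hypothetical keys):

import numpy as np

cached_key = np.array([1, 2, 3])    # hypothetical previously cached fixture parameter
request_key = np.array([1, 2, 3])   # hypothetical key for the current request
try:
    cache_hit = bool(cached_key == request_key)
except ValueError:
    # Elementwise comparison yields an array, so bool() is ambiguous and raises;
    # identity is the fallback, exactly as in the frame above.
    cache_hit = cached_key is request_key
print(cache_hit)   # False: distinct objects, so the fixture would be re-run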
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2090>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2090> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
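The pymysql import in these frames is reached through pandas' optional-dependency helpers rather than a bare import. A rough, hypothetical sketch of what the versioned_importorskip path is doing; as far as the quoted frames show, only an ImportError would be converted into a skip, so the AttributeError raised further down inside cryptography surfaces as a test error instead:

import pytest
from pandas.compat._optional import import_optional_dependency

def importorskip_sketch(name: str):
    # Hypothetical stand-in for pandas.util._test_decorators.versioned_importorskip.
    try:
        return import_optional_dependency(name)
    except ImportError as exc:
        # A clean import failure becomes a skip; other exceptions propagate.
        pytest.skip(f"optional dependency {name!r} not usable: {exc}")

# pymysql's own import chain pulls in cryptography, which is where this run fails.
pymysql = importorskip_sketch("pymysql")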
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _______ test_read_iris_query_chunksize[postgresql_psycopg2_engine_iris] ________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
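The root cause of the pymysql fixture failure is the AttributeError above: cryptography's pure-Python layer expects a hashes submodule on its compiled _rust.openssl bindings and does not find one, which on a Debian/Ubuntu testbed usually points at the python3-cryptography Python code and its compiled extension being out of step. A small, hypothetical sanity check along those lines:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print(cryptography.__version__)
print(hasattr(rust_openssl, "hashes"))   # False reproduces the AttributeError above

# With healthy bindings the high-level hashing API works end to end:
from cryptography.hazmat.primitives import hashes

digest = hashes.Hash(hashes.SHA256())
digest.update(b"pandas")
print(digest.finalize().hex())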
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1111: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
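The psycopg2 OperationalError above is the plain "nothing is listening on localhost:5432" case: this testbed runs no PostgreSQL service, so every fixture built on postgresql_psycopg2_engine fails during setup. A minimal, hypothetical pre-flight check for the service these iris fixtures assume:

import socket

def postgres_reachable(host: str = "localhost", port: int = 5432) -> bool:
    # True only if something accepts a TCP connection on host:port.
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        return False

print(postgres_reachable())   # False on this testbed, hence "Connection refused"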
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...uest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...uest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
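As the frames above show, SQLAlchemy catches the DBAPI error raised by psycopg2.connect() and re-raises it as sqlalchemy.exc.OperationalError, keeping the original exception reachable and tagging it with the e3q8 code from the sqlalche.me link in the log. A minimal sketch of that wrapping, assuming the same (unreachable) URL the fixtures use:

import sqlalchemy
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect():
        pass
except sqlalchemy.exc.OperationalError as exc:
    print(type(exc.orig))   # the wrapped psycopg2.OperationalError
    print(exc.code)         # 'e3q8', as in "Background on this error at ..."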
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ________ test_read_iris_query_chunksize[postgresql_psycopg2_conn_iris] _________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1111: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
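The getfixturevalue() docstring quoted above explains the dynamic lookup the pandas test relies on: the parametrized value is a fixture name that is resolved at run time inside the test body. A stripped-down sketch of the same pattern (the fixture and test names here are invented for illustration, not part of the pandas suite):

    # Sketch only: mirrors the request.getfixturevalue() pattern used by
    # test_read_iris_query_chunksize, with a hypothetical sqlite fixture.
    import sqlite3

    import pytest

    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    # The parametrized value is the *name* of a fixture, resolved at run time.
    @pytest.mark.parametrize("conn", ["sqlite_conn"])
    def test_select_one(conn, request):
        conn = request.getfixturevalue(conn)
        assert conn.execute("SELECT 1").fetchone() == (1,)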
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...equest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...equest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
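The Engine.begin() docstring and the _handle_dbapi_exception_noconnection() code quoted above show, respectively, how the transactional context manager is meant to be used and how the driver's error gets re-raised as sqlalchemy.exc.OperationalError. A minimal sketch of both, using the same URL as this log; since no server is listening in this environment, the except branch is what would actually run:

    # Sketch only: demonstrates the documented begin()/error-wrapping behaviour.
    import sqlalchemy
    from sqlalchemy import text
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        # Committed on success, rolled back on error, per the Engine.begin() docs.
        with engine.begin() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        # SQLAlchemy wraps the psycopg2 error; the original driver exception
        # is available as exc.orig.
        print("could not connect:", exc.orig)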
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __ test_read_iris_query_expression_with_parameter[mysql_pymysql_engine_iris] ___ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_query_expression_with_parameter(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1130: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c29f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c29f0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___ test_read_iris_query_expression_with_parameter[mysql_pymysql_conn_iris] ____ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_query_expression_with_parameter(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1130: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2ab0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2ab0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _ test_read_iris_query_expression_with_parameter[postgresql_psycopg2_engine_iris] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_query_expression_with_parameter(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1130: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque..._psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque..._psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
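The root cause visible in the frames above is a refused TCP connection to localhost:5432; neither pandas nor SQLAlchemy logic has failed yet. Below is a minimal sketch (not part of the log) of the same failure mode using psycopg2 directly; the host, port, database and credentials mirror the values shown in the traceback and are assumptions about the intended test environment, not a documented interface.

    # Minimal reproduction sketch of the "Connection refused" failure above.
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost",      # values mirror the DSN in the log (assumed)
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # With no PostgreSQL server listening on localhost:5432 this prints the
        # same "Connection refused" message seen in the traceback.
        print(f"connection failed: {exc}")
    else:
        conn.close()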
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ test_read_iris_query_expression_with_parameter[postgresql_psycopg2_conn_iris] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
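The second half of each traceback shows SQLAlchemy wrapping the driver-level error. A sketch follows (not part of the log), assuming SQLAlchemy 2.x and the same DSN as the test fixtures, of how the psycopg2 error resurfaces as sqlalchemy.exc.OperationalError with the original exception attached as .orig.

    # Sketch of SQLAlchemy's DBAPI error wrapping, assuming SQLAlchemy 2.x.
    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    # DSN mirrors the Engine repr shown above; an assumption, not an interface.
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except OperationalError as exc:
        # exc.orig is the underlying psycopg2.OperationalError ("Connection
        # refused"); the log also shows the https://sqlalche.me/e/20/e3q8 hint.
        print(type(exc.orig).__name__, exc.orig)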
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_query_expression_with_parameter(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1130: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...ql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...ql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
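[editor's note] The docstrings quoted in the frames above describe the two ways the pandas SQL tests obtain a connection from a SQLAlchemy Engine: engine.begin() (transaction committed on success, rolled back on error) and engine.connect() (plain connection, explicit commit). The sketch below only illustrates that usage; it assumes sqlalchemy and psycopg2 are installed and that a PostgreSQL server is actually reachable at the URL taken from this log, which is exactly what is missing in this test run. The "demo" table is invented for the example.

    # Minimal sketch of the Engine.begin()/Engine.connect() patterns quoted above.
    # Assumes a PostgreSQL server is listening on localhost:5432 (not the case here).
    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

    # engine.begin(): connection plus a transaction; commits if the block
    # completes, rolls back if it raises.
    with engine.begin() as conn:
        conn.execute(text("CREATE TABLE IF NOT EXISTS demo (x INTEGER)"))
        conn.execute(text("INSERT INTO demo (x) VALUES (1)"))

    # engine.connect(): plain connection; commit explicitly, and the underlying
    # DBAPI connection goes back to the pool when the block exits.
    with engine.connect() as conn:
        rows = conn.execute(text("SELECT x FROM demo")).fetchall()
        conn.commit()
        print(rows)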
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ test_read_iris_query_string_with_parameter[mysql_pymysql_engine_iris] _____ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_string_with_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s for db, query in sql_strings["read_parameters"].items(): 2410s if db in conn: 2410s break 2410s else: 2410s raise KeyError(f"No part of {conn} found in sql_strings['read_parameters']") 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1164: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
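[editor's note] The OperationalError above ("Connection refused" on both ::1 and 127.0.0.1, port 5432) means nothing is listening on the PostgreSQL port in the testbed, so every PostgreSQL-backed fixture errors out before the test body runs. A minimal way to probe that precondition up front is sketched below; postgres_reachable() is a hypothetical helper for illustration, not part of pandas or pytest.

    # Quick TCP reachability probe for the database the fixtures expect.
    import socket

    def postgres_reachable(host: str = "localhost", port: int = 5432,
                           timeout: float = 1.0) -> bool:
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if not postgres_reachable():
        print("PostgreSQL is not listening on localhost:5432; "
              "the psycopg2-backed tests will fail with OperationalError.")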
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
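[editor's note] The frames above show pytest turning the parametrized string 'mysql_pymysql_engine_iris' into a fixture value at run time via request.getfixturevalue(), exactly as the quoted docstring describes. A small self-contained illustration of that pattern follows; the fixture and test names are invented for the example, not taken from the pandas test suite.

    # Sketch of the request.getfixturevalue() pattern used by pandas' test_sql.py:
    # the fixture to use is chosen from a parametrized string.
    import pytest

    @pytest.fixture
    def sqlite_memory_url():
        return "sqlite:///:memory:"

    @pytest.fixture
    def duckdb_memory_url():
        return "duckdb:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_memory_url", "duckdb_memory_url"])
    def test_pick_backend(conn, request):
        # The parametrized value is a fixture *name*; resolve it dynamically.
        url = request.getfixturevalue(conn)
        assert "memory" in url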
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
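[editor's note] The pluggy frames above (_hooks.py HookCaller.__call__ and _manager.py _hookexec) are the generic plugin-hook dispatch pytest uses to run pytest_fixture_setup. The sketch below shows the same mechanism used directly, outside pytest; the "demo" project name and hook names are made up for illustration.

    # Minimal pluggy example mirroring the HookCaller/_hookexec machinery above.
    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_resource(self, name):
            """Return a resource for *name*; first non-None result wins."""

    class Plugin:
        @hookimpl
        def setup_resource(self, name):
            return f"resource:{name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())

    # Equivalent to the hook(**kwargs) call shown in pluggy/_hooks.py above.
    print(pm.hook.setup_resource(name="iris"))  # -> "resource:iris"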
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2f30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2f30> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
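[editor's note] The PyMySQL source quoted above documents its MySQLdb compatibility shim. When pymysql imports cleanly, the two lines below are all that the shim requires; in this run the import never gets that far because of the cryptography failure shown below. This is only a usage sketch of the documented API.

    # MySQLdb compatibility shim described in the pymysql source quoted above.
    # Assumes pymysql (and its optional cryptography dependency) imports cleanly.
    import pymysql

    pymysql.install_as_MySQLdb()

    import MySQLdb  # now actually resolves to the pymysql module
    print(MySQLdb.get_client_info())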
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _____ test_read_iris_query_string_with_parameter[mysql_pymysql_conn_iris] ______ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_string_with_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s for db, query in sql_strings["read_parameters"].items(): 2410s if db in conn: 2410s break 2410s else: 2410s raise KeyError(f"No part of {conn} found in sql_strings['read_parameters']") 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1164: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 
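[editor's note] The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is the root cause of every pymysql-based failure in this run: the pure-Python layer of cryptography and its compiled Rust bindings are out of step, so importing cryptography.hazmat.primitives.hashes, and therefore pymysql, fails at import time. Below is a hedged sketch of how a test helper could detect that and skip rather than error; the helper is illustrative only, not how pandas handles it.

    # Guarded import illustrating the failure mode above: pymysql only imports
    # if the cryptography package and its _rust bindings agree with each other.
    import pytest

    def pymysql_or_skip():
        try:
            import pymysql  # pulls in cryptography.hazmat.primitives.hashes
        except (ImportError, AttributeError) as exc:
            pytest.skip(f"pymysql unusable in this environment: {exc!r}")
        return pymysql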
2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2ff0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c2ff0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _ test_read_iris_query_string_with_parameter[postgresql_psycopg2_engine_iris] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_string_with_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s for db, query in sql_strings["read_parameters"].items(): 2410s if db in conn: 2410s break 2410s else: 2410s raise KeyError(f"No part of {conn} found in sql_strings['read_parameters']") 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1164: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 
2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...esql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...esql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __ test_read_iris_query_string_with_parameter[postgresql_psycopg2_conn_iris] ___ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_iris_query_string_with_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'chunksize' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s for db, query in sql_strings["read_parameters"].items(): 2410s if db in conn: 2410s break 2410s else: 2410s raise KeyError(f"No part of {conn} found in sql_strings['read_parameters']") 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1164: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...gresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...gresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
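create_and_load_iris, shown earlier in this traceback, drives its INSERT through SQLAlchemy's Engine.begin() context manager, whose docstring and example appear above. A minimal sketch of that usage against the same URL (password written out as in the cparams above; it fails with the same OperationalError unless a local PostgreSQL server is running):

    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    # begin() checks a connection out of the pool and opens a transaction;
    # it commits on success and rolls back if the block raises.
    with engine.begin() as conn:
        conn.execute(text("SELECT 1"))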
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _______________ test_read_iris_table[mysql_pymysql_engine_iris] ________________ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table(conn, request): 2410s # GH 51015 if conn = sqlite_iris_str 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1172: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
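Annotation: the helper chain above (versioned_importorskip delegating to import_optional_dependency) is what normally turns a missing optional dependency into a test skip; here the pymysql import fails with an AttributeError deep in its dependency chain, so the failure surfaces as a test error instead of a skip. A small usage sketch of the helper shown in the traceback, with the module path taken from the traceback and treated as pandas-internal API:

    from pandas.compat._optional import import_optional_dependency

    # Per the docstring above, errors="ignore" returns None when the module is
    # not installed instead of raising, leaving version checks to the caller.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql unavailable; the MySQL fixtures above would normally skip")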
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c36b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c36b0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ________________ test_read_iris_table[mysql_pymysql_conn_iris] _________________ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table(conn, request): 2410s # GH 51015 if conn = sqlite_iris_str 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1172: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
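Annotation: the AttributeError above means cryptography's Rust binding module on this testbed is missing its hashes submodule, so even basic hazmat hashing fails at import time and takes pymysql's _auth module down with it. For illustration only, the documented one-shot hashing pattern that depends on the broken attribute:

    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())   # backed by rust_openssl.hashes.Hash
    digest.update(b"pandas autopkgtest")
    print(digest.finalize().hex())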
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
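Annotation: the repeated _get_active_fixturedef / execute frames above are pytest resolving a chain of fixtures by name (mysql_pymysql_conn_iris requesting mysql_pymysql_engine_iris, which requests mysql_pymysql_engine). A hypothetical, self-contained sketch of the same pattern pandas' test_sql.py uses, parametrizing a test with fixture names and resolving them lazily via request.getfixturevalue():

    import pytest

    @pytest.fixture
    def sqlite_str():
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_conn_name(conn, request):
        # The parameter is a fixture *name*; it is only resolved here, which is
        # why broken database fixtures fail during setup of each parametrized test.
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")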
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c37d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c37d0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ____________ test_read_iris_table[postgresql_psycopg2_engine_iris] _____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
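Annotation: the postgresql_psycopg2_engine_iris failure above begins inside SQLAlchemy's connection pool. Building the Engine is lazy; the DBAPI connection is only attempted on the first pool checkout via raw_connection(), which is where the refused connection further down in this traceback surfaces. A minimal sketch using the same URL as in the traceback (credentials as shown there; assumes SQLAlchemy and psycopg2 are installed):

    from sqlalchemy import create_engine

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )  # no network activity yet
    try:
        with engine.connect() as conn:   # first checkout triggers psycopg2.connect
            pass
    except Exception as exc:
        print(type(exc).__name__, exc)   # OperationalError: connection refused here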
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table(conn, request): 2410s # GH 51015 if conn = sqlite_iris_str 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1172: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
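Annotation: the OperationalError above is the plain psycopg2 connection attempt failing because no PostgreSQL server appears to be listening on localhost:5432 in the autopkgtest environment. It can be reproduced directly with the same parameters the dsn in the traceback shows:

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432, dbname="pandas",
            user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as exc:
        print("connection failed:", exc)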
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
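The pool frames above show why the error only surfaces at this point: create_engine() makes no connection by itself, and the first checkout from the QueuePool is what actually invokes psycopg2.connect(). A minimal sketch, assuming the engine URL shown in the log (password as in the logged DSN):

# Sketch: the first pool checkout inside begin()/connect() performs the real
# DBAPI connect, so the refused connection appears only inside the fixture.
from sqlalchemy import create_engine, text
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.begin() as conn:          # first checkout -> psycopg2.connect()
        conn.execute(text("SELECT 1"))
except OperationalError as exc:
    print(f"refused, as in the traceback: {exc}")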
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
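The keyword parameters listed in the docstring above are collapsed into the single libpq DSN string visible in this frame via psycopg2's make_dsn helper before _connect() is called. A small sketch using the same values taken from the log:

# Sketch: keyword arguments -> libpq DSN string, as done just before _connect().
from psycopg2.extensions import make_dsn

dsn = make_dsn(host="localhost", dbname="pandas",
               user="postgres", password="postgres", port=5432)
print(dsn)   # e.g. 'host=localhost dbname=pandas user=postgres password=postgres port=5432'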
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____________ test_read_iris_table[postgresql_psycopg2_conn_iris] ______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
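As the frames above show, SQLAlchemy re-raises the driver's error as sqlalchemy.exc.OperationalError; the original psycopg2 exception stays reachable on .orig, and the sqlalche.me link printed in the log is derived from the exception's error code. A short sketch, assuming the same engine URL as above:

# Sketch: inspecting the wrapped DBAPI error SQLAlchemy raises here.
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    engine.connect()
except OperationalError as exc:
    print(type(exc.orig).__name__)   # the underlying psycopg2 OperationalError
    print(exc.code)                  # error code used to build the sqlalche.me URL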
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table(conn, request): 2410s # GH 51015 if conn = sqlite_iris_str 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1172: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________ test_read_iris_table_chunksize[mysql_pymysql_engine_iris] ___________ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1185: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c3e90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c3e90> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________ test_read_iris_table_chunksize[mysql_pymysql_conn_iris] ____________ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1185: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c3e90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4c3e90> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
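# --- illustrative sketch (not from the captured output): why this failure surfaces as an
# ERROR rather than a SKIP. import_optional_dependency (the pandas helper shown in the
# frames above; a private module, imported here only for illustration) turns a missing
# module into an ImportError, which versioned_importorskip is designed to convert into a
# skip. The AttributeError raised inside cryptography while importing pymysql is not an
# ImportError, so it propagates and the fixture setup fails instead. ---
from pandas.compat._optional import import_optional_dependency

try:
    import_optional_dependency("pymysql", errors="raise")
except ImportError as exc:
    # A genuinely missing pymysql would land here and the test would be skipped.
    print("would skip:", exc)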
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _______ test_read_iris_table_chunksize[postgresql_psycopg2_engine_iris] ________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
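# --- illustrative sketch (not from the captured output): the AttributeError above,
# "module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'",
# usually indicates that the pure-Python part of the cryptography package does not match
# the compiled Rust extension installed alongside it. A minimal check, assuming only
# that cryptography is importable: ---
import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography version:", cryptography.__version__)
print("rust bindings expose 'hashes':", hasattr(rust_openssl, "hashes"))
# False here reproduces the failure: pymysql._auth (and anything else importing
# cryptography.hazmat.primitives.hashes) will raise exactly as in the traceback above.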
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1185: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
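# --- illustrative sketch of the dynamic-fixture pattern used by
# test_read_iris_table_chunksize above (self-contained; hypothetical fixture name and
# value, no database server needed): ---
import pytest

@pytest.fixture
def sqlite_url():
    # Stand-in for the postgresql/mysql engine fixtures in the traceback.
    return "sqlite:///:memory:"

@pytest.mark.parametrize("conn", ["sqlite_url"])
def test_resolves_fixture_by_name(conn, request):
    # getfixturevalue() runs the named fixture at setup time, which is why a failing
    # database fixture aborts the parametrized test before its body ever executes.
    url = request.getfixturevalue(conn)
    assert url.startswith("sqlite")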
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...uest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...uest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
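# --- illustrative sketch (not from the captured output): create_engine() is lazy, so the
# postgresql fixtures above build an Engine successfully and the OperationalError only
# surfaces at the first pool checkout inside engine.begin(). Hypothetical script reusing
# the DSN shown in the traceback: ---
from sqlalchemy import create_engine, text
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
# No socket has been opened yet; only the URL has been parsed.
try:
    with engine.begin() as conn:  # first checkout -> psycopg2.connect() runs here
        conn.execute(text("SELECT 1"))
except OperationalError as exc:
    print("connection failed at first checkout:", exc)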
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
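# --- illustrative sketch of the connect() call that fails above; the keyword arguments
# mirror the cparams shown in the traceback (not part of the captured output): ---
import psycopg2

try:
    conn = psycopg2.connect(
        host="localhost", port=5432, dbname="pandas",
        user="postgres", password="postgres",
    )
except psycopg2.OperationalError as exc:
    # With no PostgreSQL server listening on localhost:5432 this raises the same
    # "Connection refused" error reported during fixture setup above.
    print(exc)
else:
    conn.close()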
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ________ test_read_iris_table_chunksize[postgresql_psycopg2_conn_iris] _________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
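# --- illustrative sketch (not from the captured output): "Connection refused" means
# nothing is accepting TCP connections on port 5432 on this testbed. A driver-independent
# check using only the standard library, with host/port taken from the DSN above: ---
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("postgres reachable on localhost:5432:", port_open("localhost", 5432))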
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2410s def test_read_iris_table_chunksize(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1185: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
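The pytest frames above are just the machinery behind request.getfixturevalue(): the pandas SQL tests are parametrized over fixture *names* (strings such as 'postgresql_psycopg2_conn_iris') and resolve the real fixture at run time, so a broken database fixture surfaces as an error inside _get_active_fixturedef/execute rather than at collection. A small sketch of the same pattern, with entirely hypothetical fixture names:

```python
# Sketch with made-up names; mirrors the pattern in pandas/tests/io/test_sql.py where
# the test receives a fixture name and resolves it via request.getfixturevalue().
import pytest


@pytest.fixture
def fake_engine():
    return "engine-object"                 # stand-in for a real SQLAlchemy engine


@pytest.mark.parametrize("conn", ["fake_engine"])
def test_dynamic_fixture(conn, request):
    conn = request.getfixturevalue(conn)   # fixture setup errors surface here
    assert conn == "engine-object"
```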
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...equest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...equest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
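The pluggy frames (_HookCaller.__call__ and _hookexec) are generic plugin-dispatch plumbing: pytest_fixture_setup is a firstresult hook, so the first implementation that returns a non-None value supplies the fixture value. A toy sketch of that mechanism, using a hypothetical hook named compute rather than any pytest hook:

```python
# Sketch of pluggy's firstresult dispatch; the project name "demo" and the hook
# "compute" are invented for illustration.
import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")


class Spec:
    @hookspec(firstresult=True)
    def compute(self, x):
        """Return a value; the first non-None result wins."""


class ImplA:
    @hookimpl
    def compute(self, x):
        return None          # declines, dispatch moves on


class ImplB:
    @hookimpl
    def compute(self, x):
        return x * 2


pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(ImplA())
pm.register(ImplB())
print(pm.hook.compute(x=21))   # -> 42
```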
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
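Stepping back from the error-handling frames: the fixture that triggers all of this, create_and_load_iris (shown a little further up), simply builds one multi-row INSERT from the iris CSV and executes it inside engine.begin(), so the first thing it needs is a working connection. A self-contained sketch of that load pattern; the table, columns, CSV text and the in-memory SQLite engine are stand-ins so it runs without any server:

```python
# Sketch of the bulk-load pattern used by create_and_load_iris; names and engine are
# illustrative stand-ins, not the test suite's real objects.
import csv
import io

from sqlalchemy import Column, Float, MetaData, String, Table, create_engine, insert

csv_text = "SepalLength,Name\n5.1,setosa\n6.2,virginica\n"
engine = create_engine("sqlite:///:memory:")

metadata = MetaData()
iris = Table(
    "iris_sketch",
    metadata,
    Column("SepalLength", Float),
    Column("Name", String(50)),
)
metadata.create_all(engine)

reader = csv.reader(io.StringIO(csv_text))
header = next(reader)
params = [dict(zip(header, row)) for row in reader]   # same dict-per-row trick as the fixture

with engine.begin() as conn:                          # commits on success, rolls back on error
    conn.execute(insert(iris).values(params))
```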
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________________ test_to_sql_callable[mysql_pymysql_engine] __________________ 2410s conn = 'mysql_pymysql_engine' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2410s def test_to_sql_callable(conn, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1194: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8a70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8a70> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________________ test_to_sql_callable[mysql_pymysql_conn] ___________________ 2410s conn = 'mysql_pymysql_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2410s def test_to_sql_callable(conn, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1194: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
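The MySQL-backed parametrizations fail for a different reason than the PostgreSQL ones: importing pymysql pulls in cryptography, and this cryptography build's Rust extension is missing its hashes submodule, so the assignment Hash = rust_openssl.hashes.Hash raises AttributeError at import time. That points at the cryptography installation on the testbed rather than at pandas, and it can be reproduced without pandas or pymysql at all; a minimal sketch:

```python
# Sketch: reproduces the import-time failure seen above, independent of pymysql.
# On a healthy cryptography build this prints the Hash class; on the failing testbed
# it raises AttributeError because _rust.openssl exposes no 'hashes' attribute.
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

try:
    print(rust_openssl.hashes.Hash)
except AttributeError as exc:
    print(f"broken cryptography build: {exc}")
```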
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8ad0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8ad0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _______________ test_to_sql_callable[postgresql_psycopg2_engine] _______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2410s def test_to_sql_callable(conn, test_frame1, request): 2410s conn = request.getfixturevalue(conn) 2410s 2410s check = [] # used to double check function below is really being used 2410s 2410s def sample(pd_table, conn, keys, data_iter): 2410s check.append(1) 2410s data = [dict(zip(keys, row)) for row in data_iter] 2410s conn.execute(pd_table.table.insert(), data) 2410s 2410s > with pandasSQL_builder(conn, need_transaction=True) as pandasSQL: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1203: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ________________ test_to_sql_callable[postgresql_psycopg2_conn] ________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2410s def test_to_sql_callable(conn, test_frame1, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1194: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
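The frames above show Connection.__init__ catching the DBAPI error raised by engine.raw_connection() and re-raising it through _handle_dbapi_exception_noconnection as sqlalchemy.exc.OperationalError (the e3q8 link in the message). A small sketch of catching the wrapped error at the engine level, assuming SQLAlchemy and psycopg2 are installed; the URL matches the one printed in the log:

# Hedged sketch: the wrapped error surfaces as sqlalchemy.exc.OperationalError,
# with the original psycopg2 error available as .orig / __cause__.
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

try:
    with engine.connect():
        pass
except OperationalError as exc:
    print(type(exc.orig))               # the underlying psycopg2.OperationalError
    print(exc.connection_invalidated)   # whether SQLAlchemy treats it as a disconnect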
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___________ test_default_type_conversion[mysql_pymysql_engine_types] ___________ 2410s conn = 'mysql_pymysql_engine_types' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_types) 2410s def test_default_type_conversion(conn, request): 2410s conn_name = conn 2410s if conn_name == "sqlite_buildin_types": 2410s request.applymarker( 2410s pytest.mark.xfail( 2410s reason="sqlite_buildin connection does not implement read_sql_table" 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1220: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_types' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_types' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8d70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8d70> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
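install_as_MySQLdb(), whose docstring appears above, simply aliases the pymysql module under the MySQLdb name in sys.modules so MySQLdb-only code picks up PyMySQL. A short usage sketch, assuming pymysql imports cleanly (which it does not in this run because of the cryptography failure further below):

# Hedged sketch of the mysqlclient/MySQLdb compatibility shim described above.
import pymysql

pymysql.install_as_MySQLdb()

import MySQLdb                     # now resolves to the pymysql module
assert MySQLdb is pymysql
print(MySQLdb.get_client_info())   # PyMySQL's mysqlclient-compatible version string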
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ____________ test_default_type_conversion[mysql_pymysql_conn_types] ____________ 2410s conn = 'mysql_pymysql_conn_types' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_types) 2410s def test_default_type_conversion(conn, request): 2410s conn_name = conn 2410s if conn_name == "sqlite_buildin_types": 2410s request.applymarker( 2410s pytest.mark.xfail( 2410s reason="sqlite_buildin connection does not implement read_sql_table" 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1220: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_types' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 
2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_types' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_types' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8e30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e8e30> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ________ test_default_type_conversion[postgresql_psycopg2_engine_types] ________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_types' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_types) 2410s def test_default_type_conversion(conn, request): 2410s conn_name = conn 2410s if conn_name == "sqlite_buildin_types": 2410s request.applymarker( 2410s pytest.mark.xfail( 2410s reason="sqlite_buildin connection does not implement read_sql_table" 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1220: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_types' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_types' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'requ...uest 'postgresql_psycopg2_engine_types' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'requ...uest 'postgresql_psycopg2_engine_types' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2410s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2410s dialect = 'postgres' 2410s 2410s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2410s from sqlalchemy import insert 2410s from sqlalchemy.engine import Engine 2410s 2410s types = types_table_metadata(dialect) 2410s 2410s stmt = insert(types).values(types_data) 2410s if isinstance(conn, Engine): 2410s 
> with conn.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters 
if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 
2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise 
exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _________ test_default_type_conversion[postgresql_psycopg2_conn_types] _________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
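The failures above all reduce to the same root cause: nothing is listening on localhost:5432 in the testbed, so psycopg2 raises OperationalError and SQLAlchemy re-raises it as sqlalchemy.exc.OperationalError while the pandas fixture tries to open an engine connection. A minimal sketch of that wrapping behaviour, assuming SQLAlchemy 2.x and psycopg2 are installed and no PostgreSQL server is reachable on that port (the connection URL mirrors the one in the log but is otherwise illustrative):

    import sqlalchemy
    from sqlalchemy import create_engine

    # create_engine() is lazy; the DBAPI connection is only attempted on connect()
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect():
            pass
    except sqlalchemy.exc.OperationalError as exc:
        # exc.orig carries the underlying psycopg2.OperationalError
        print(type(exc.orig).__name__, exc.orig)

With no server running, the except branch is taken and the message matches the "Connection refused" text printed repeatedly in this test run.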
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_types' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_types) 2410s def test_default_type_conversion(conn, request): 2410s conn_name = conn 2410s if conn_name == "sqlite_buildin_types": 2410s request.applymarker( 2410s pytest.mark.xfail( 2410s reason="sqlite_buildin connection does not implement read_sql_table" 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1220: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_types' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_types' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_types' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'requ...equest 'postgresql_psycopg2_engine_types' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'requ...equest 'postgresql_psycopg2_engine_types' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2410s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2410s dialect = 'postgres' 2410s 2410s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2410s from sqlalchemy import insert 2410s from sqlalchemy.engine import Engine 2410s 2410s types = types_table_metadata(dialect) 2410s 2410s stmt = insert(types).values(types_data) 2410s if isinstance(conn, Engine): 2410s 
> with conn.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters 
if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 
2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise 
exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
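The same refusal can be reproduced one layer lower, directly against the psycopg2 DBAPI call that the dialect's connect() ends up in. A small sketch, assuming psycopg2 is installed and the port is still unused; connect_timeout is an illustrative extra keyword passed through to libpq, not something the test suite sets:

    import psycopg2

    try:
        psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
            connect_timeout=3,  # fail fast instead of waiting on the default timeout
        )
    except psycopg2.OperationalError as err:
        # This is the error SQLAlchemy wraps in the tracebacks above
        print("connection failed:", err)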
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________________ test_read_procedure[mysql_pymysql_engine] ___________________ 2410s conn = 'mysql_pymysql_engine' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", mysql_connectable) 2410s def test_read_procedure(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1244: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 
2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
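[editor's note] The pluggy and pytest frames around this point (_hookexec, pytest_fixture_setup, call_fixture_func) are only the machinery that resolves the mysql_pymysql_engine fixture requested by name: the pandas SQL tests parametrize over fixture names and resolve them at run time with request.getfixturevalue, which is why fixture setup errors surface inside these frames. A minimal sketch of that pattern, with hypothetical fixture and test names.

# Illustrative only: mirrors how test_read_procedure resolves its connection
# fixture; fake_engine and test_uses_named_fixture are hypothetical names.
import pytest

@pytest.fixture
def fake_engine():
    # Stand-in for mysql_pymysql_engine; a real fixture would build an engine.
    return object()

@pytest.mark.parametrize("conn", ["fake_engine"])
def test_uses_named_fixture(conn, request):
    # The string parameter is turned into the fixture value at run time,
    # so any error raised during fixture setup appears here.
    conn = request.getfixturevalue(conn)
    assert conn is not None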
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e9790>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e9790> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
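[editor's note] The pymysql module source listed above ends with its mysqlclient-compatibility shim. For reference, this is the normal use of install_as_MySQLdb(), assuming pymysql imports cleanly (which it does not in this run).

# Standard use of the compatibility shim shown above.
import pymysql

pymysql.install_as_MySQLdb()   # registers pymysql under the name "MySQLdb"

import MySQLdb                 # now resolves to the pymysql module

assert MySQLdb is pymysql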
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________________ test_read_procedure[mysql_pymysql_conn] ____________________ 2410s conn = 'mysql_pymysql_conn' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", mysql_connectable) 2410s def test_read_procedure(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1244: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
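[editor's note] pymysql._auth only needs the high-level hashing and serialization primitives from cryptography; the AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") indicates the installed cryptography Python package and its compiled _rust bindings are out of sync, not that the API is misused. For contrast, the documented use of the hashes module that this import chain would normally provide:

# What a consistent cryptography install provides to pymysql._auth.
from cryptography.hazmat.primitives import hashes

digest = hashes.Hash(hashes.SHA256())
digest.update(b"example data")
print(digest.finalize().hex())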
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
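[editor's note] As the docstring below explains, import_optional_dependency either returns the module, raises, warns, or returns None depending on errors=. A short sketch of those modes; this is a pandas-internal helper, shown only because the traceback runs through it.

# Sketch of the errors= modes documented in the traceback; pandas-internal API.
from pandas.compat._optional import import_optional_dependency

# errors="ignore" returns None when the package is absent instead of raising.
mod = import_optional_dependency("pymysql", errors="ignore")
if mod is None:
    print("pymysql not importable")
else:
    print("pymysql", mod.__version__)

# Note: only a missing module is handled this way; an exception raised while
# the module itself imports (the AttributeError in this log) still propagates,
# which is why these tests error out instead of being skipped.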
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e97f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4e97f0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ____ test_copy_from_callable_insertion_method[2-postgresql_psycopg2_engine] ____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
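[editor's note] The frames above enter Connection.__init__ -> Engine.raw_connection() -> Pool.connect(), i.e. an ordinary pooled checkout for the engine URL printed in the log. A minimal sketch of the same construction (password substituted back in; it raises the same OperationalError if nothing listens on port 5432).

# Minimal sketch of the checkout chain above, using the engine URL from the log.
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    raw = engine.raw_connection()   # proxied DBAPI connection from the pool
    raw.close()                     # close() returns it to the pool, as documented above
except OperationalError as exc:
    # Matches the failure in the log when PostgreSQL is not running.
    print(f"pool checkout failed: {exc}")
finally:
    engine.dispose()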
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s expected_count = 2 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s @pytest.mark.parametrize("expected_count", [2, "Success!"]) 2410s def test_copy_from_callable_insertion_method(conn, expected_count, request): 2410s # GH 8953 2410s # Example in io.rst found under _io.sql.method 2410s # not available in sqlite, mysql 2410s def psql_insert_copy(table, conn, keys, data_iter): 2410s # gets a DBAPI connection that can provide a cursor 2410s dbapi_conn = conn.connection 2410s with dbapi_conn.cursor() as cur: 2410s s_buf = StringIO() 2410s writer = csv.writer(s_buf) 2410s writer.writerows(data_iter) 2410s s_buf.seek(0) 2410s 2410s columns = ", ".join([f'"{k}"' for k in keys]) 2410s if table.schema: 2410s table_name = f"{table.schema}.{table.name}" 2410s else: 2410s table_name = table.name 2410s 2410s sql_query = f"COPY {table_name} ({columns}) FROM STDIN WITH CSV" 2410s cur.copy_expert(sql=sql_query, file=s_buf) 2410s return expected_count 2410s 2410s conn = request.getfixturevalue(conn) 2410s expected = DataFrame({"col1": [1, 2], "col2": [0.1, 0.2], "col3": ["a", "n"]}) 2410s > result_count = expected.to_sql( 2410s name="test_frame", con=conn, index=False, method=psql_insert_copy 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1306: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ( col1 col2 col3 2410s 0 1 0.1 a 2410s 1 2 0.2 n,) 2410s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'method': .psql_insert_copy at 
0x79e9eafa9f80>, 'name': 'test_frame'} 2410s 2410s @wraps(func) 2410s def wrapper(*args, **kwargs): 2410s if len(args) > num_allow_args: 2410s warnings.warn( 2410s msg.format(arguments=_format_argument_list(allow_args)), 2410s FutureWarning, 2410s stacklevel=find_stack_level(), 2410s ) 2410s > return func(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = col1 col2 col3 2410s 0 1 0.1 a 2410s 1 2 0.2 n 2410s name = 'test_frame' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'fail', index = False, index_label = None 2410s chunksize = None, dtype = None 2410s method = .psql_insert_copy at 0x79e9eafa9f80> 2410s 2410s @final 2410s @deprecate_nonkeyword_arguments( 2410s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2410s ) 2410s def to_sql( 2410s self, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool_t = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2410s newly created, appended to, or overwritten. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s Name of SQL table. 2410s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. Legacy support is provided for sqlite3.Connection objects. The user 2410s is responsible for engine disposal and connection closure for the SQLAlchemy 2410s connectable. See `here \ 2410s `_. 2410s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2410s the transaction will not be committed. If passing a sqlite3.Connection, 2410s it will not be possible to roll back the record insertion. 2410s 2410s schema : str, optional 2410s Specify the schema (if database flavor supports this). If None, use 2410s default schema. 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s How to behave if the table already exists. 2410s 2410s * fail: Raise a ValueError. 2410s * replace: Drop the table before inserting new values. 2410s * append: Insert new values to the existing table. 2410s 2410s index : bool, default True 2410s Write DataFrame index as a column. Uses `index_label` as the column 2410s name in the table. Creates a table index for this column. 2410s index_label : str or sequence, default None 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2410s scalar is provided, it will be applied to all columns. 
2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s * None : Uses standard SQL ``INSERT`` clause (one per row). 2410s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2410s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s The number of returned rows affected is the sum of the ``rowcount`` 2410s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2410s reflect the exact number of written rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Raises 2410s ------ 2410s ValueError 2410s When the table already exists and `if_exists` is 'fail' (the 2410s default). 2410s 2410s See Also 2410s -------- 2410s read_sql : Read a DataFrame from a table. 2410s 2410s Notes 2410s ----- 2410s Timezone aware datetime columns will be written as 2410s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2410s database. Otherwise, the datetimes will be stored as timezone unaware 2410s timestamps local to the original timezone. 2410s 2410s Not all datastores support ``method="multi"``. Oracle, for example, 2410s does not support multi-value insert. 2410s 2410s References 2410s ---------- 2410s .. [1] https://docs.sqlalchemy.org 2410s .. [2] https://www.python.org/dev/peps/pep-0249/ 2410s 2410s Examples 2410s -------- 2410s Create an in-memory SQLite database. 2410s 2410s >>> from sqlalchemy import create_engine 2410s >>> engine = create_engine('sqlite://', echo=False) 2410s 2410s Create a table from scratch with 3 rows. 2410s 2410s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2410s >>> df 2410s name 2410s 0 User 1 2410s 1 User 2 2410s 2 User 3 2410s 2410s >>> df.to_sql(name='users', con=engine) 2410s 3 2410s >>> from sqlalchemy import text 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2410s 2410s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2410s 2410s >>> with engine.begin() as connection: 2410s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2410s ... df1.to_sql(name='users', con=connection, if_exists='append') 2410s 2 2410s 2410s This is allowed to support operations that require that the same 2410s DBAPI connection is used for the entire operation. 2410s 2410s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2410s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2410s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2410s (1, 'User 7')] 2410s 2410s Overwrite the table with just ``df2``. 2410s 2410s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2410s ... index_label='id') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... 
conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 6'), (1, 'User 7')] 2410s 2410s Use ``method`` to define a callable insertion method to do nothing 2410s if there's a primary key conflict on a table in a PostgreSQL database. 2410s 2410s >>> from sqlalchemy.dialects.postgresql import insert 2410s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2410s ... # "a" is the primary key in "conflict_table" 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2410s 0 2410s 2410s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2410s on a primary key. 2410s 2410s >>> from sqlalchemy.dialects.mysql import insert 2410s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2410s ... # update columns "b" and "c" on primary key conflict 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = ( 2410s ... insert(table.table) 2410s ... .values(data) 2410s ... ) 2410s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2410s 2 2410s 2410s Specify the dtype (especially useful for integers with missing values). 2410s Notice that while pandas is forced to store the data as floating point, 2410s the database supports nullable integers. When fetching the data with 2410s Python, we get back integer scalars. 2410s 2410s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2410s >>> df 2410s A 2410s 0 1.0 2410s 1 NaN 2410s 2 2.0 2410s 2410s >>> from sqlalchemy.types import Integer 2410s >>> df.to_sql(name='integers', con=engine, index=False, 2410s ... dtype={"A": Integer()}) 2410s 3 2410s 2410s >>> with engine.connect() as conn: 2410s ... 
conn.execute(text("SELECT * FROM integers")).fetchall() 2410s [(1,), (None,), (2,)] 2410s """ # noqa: E501 2410s from pandas.io import sql 2410s 2410s > return sql.to_sql( 2410s self, 2410s name, 2410s con, 2410s schema=schema, 2410s if_exists=if_exists, 2410s index=index, 2410s index_label=index_label, 2410s chunksize=chunksize, 2410s dtype=dtype, 2410s method=method, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s frame = col1 col2 col3 2410s 0 1 0.1 a 2410s 1 2 0.2 n 2410s name = 'test_frame' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'fail', index = False, index_label = None 2410s chunksize = None, dtype = None 2410s method = .psql_insert_copy at 0x79e9eafa9f80> 2410s engine = 'auto', engine_kwargs = {} 2410s 2410s def to_sql( 2410s frame, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s engine: str = "auto", 2410s **engine_kwargs, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Parameters 2410s ---------- 2410s frame : DataFrame, Series 2410s name : str 2410s Name of SQL table. 2410s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2410s or sqlite3 DBAPI2 connection 2410s ADBC provides high performance I/O with native type support, where available. 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. 2410s If a DBAPI2 object, only sqlite3 is supported. 2410s schema : str, optional 2410s Name of SQL schema in database to write to (if database flavor 2410s supports this). If None, use default schema (default). 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s - fail: If table exists, do nothing. 2410s - replace: If table exists, drop it, recreate it, and insert data. 2410s - append: If table exists, insert data. Create if does not exist. 2410s index : bool, default True 2410s Write DataFrame index as a column. 2410s index_label : str or sequence, optional 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s - None : Uses standard SQL ``INSERT`` clause (one per row). 2410s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2410s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 
2410s engine : {'auto', 'sqlalchemy'}, default 'auto' 2410s SQL engine library to use. If 'auto', then the option 2410s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2410s behavior is 'sqlalchemy' 2410s 2410s .. versionadded:: 1.3.0 2410s 2410s **engine_kwargs 2410s Any additional kwargs are passed to the engine. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Notes 2410s ----- 2410s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2410s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2410s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2410s rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__ 2410s """ # noqa: E501 2410s if if_exists not in ("fail", "replace", "append"): 2410s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2410s 2410s if isinstance(frame, Series): 2410s frame = frame.to_frame() 2410s elif not isinstance(frame, DataFrame): 2410s raise NotImplementedError( 2410s "'frame' argument should be either a Series or a DataFrame" 2410s ) 2410s 2410s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 
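The comment above describes how pandas ties the Engine and Connection lifetimes to a single ``ExitStack`` so that both are cleaned up together. A standalone sketch of that pattern, using an in-memory SQLite URL rather than the PostgreSQL URL from this run:

from contextlib import ExitStack
from sqlalchemy import create_engine, text

stack = ExitStack()
engine = create_engine("sqlite://")
stack.callback(engine.dispose)                 # dispose the Engine last
conn = stack.enter_context(engine.connect())   # close the Connection first

conn.execute(text("SELECT 1"))
stack.close()   # unwinds in reverse order: close the connection, then dispose the engine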
2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if 
should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. 
seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 
2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 
2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _____ test_copy_from_callable_insertion_method[2-postgresql_psycopg2_conn] _____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 
2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise 
exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
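The psycopg2 docstring quoted above shows the two equivalent ways of passing connection parameters. A minimal usage sketch with the same credentials this test run attempts; it assumes a PostgreSQL server is actually listening on localhost:5432, which is exactly what is missing in this environment:

import psycopg2

# keyword arguments ...
conn = psycopg2.connect(
    dbname="pandas", user="postgres", password="postgres",
    host="localhost", port=5432,
)
# ... or the equivalent libpq DSN string:
# conn = psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")

with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
conn.close()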
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn', expected_count = 2 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s @pytest.mark.parametrize("expected_count", [2, "Success!"]) 2410s def test_copy_from_callable_insertion_method(conn, expected_count, request): 2410s # GH 8953 2410s # Example in io.rst found under _io.sql.method 2410s # not available in sqlite, mysql 2410s def psql_insert_copy(table, conn, keys, data_iter): 2410s # gets a DBAPI connection that can provide a cursor 2410s dbapi_conn = conn.connection 2410s with dbapi_conn.cursor() as cur: 2410s s_buf = StringIO() 2410s writer = csv.writer(s_buf) 2410s writer.writerows(data_iter) 2410s s_buf.seek(0) 2410s 2410s columns = ", ".join([f'"{k}"' for k in keys]) 2410s if table.schema: 2410s table_name = f"{table.schema}.{table.name}" 2410s else: 2410s table_name = table.name 2410s 2410s sql_query = f"COPY {table_name} ({columns}) FROM STDIN WITH CSV" 2410s cur.copy_expert(sql=sql_query, file=s_buf) 2410s return expected_count 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1304: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
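The pluggy ``HookCaller.__call__`` docstring above notes that hooks are invoked with keyword arguments only and collect a result from every registered plugin. A small self-contained sketch of that mechanism; the project name and hook ("demo", "scale") are invented for illustration:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec
    def scale(self, value):
        """Return a scaled copy of value."""

class Doubler:
    @hookimpl
    def scale(self, value):
        return value * 2

class Tripler:
    @hookimpl
    def scale(self, value):
        return value * 3

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(Doubler())
pm.register(Tripler())

# keyword-only call; plugins answer in last-registered-first order
print(pm.hook.scale(value=10))   # [30, 20]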
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
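Stepping back from the pool internals: the fixture that starts this chain, postgresql_psycopg2_conn (quoted further up, pandas/tests/io/test_sql.py:681), simply opens a connection from an engine fixture. A minimal sketch of that arrangement, reconstructed from what the traceback shows rather than copied from the pandas test suite (fixture names and the engine URL mirror the log):

import pytest
from sqlalchemy import create_engine

@pytest.fixture
def postgresql_psycopg2_engine():
    # Illustrative engine fixture; the DSN matches the one visible in the log.
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    yield engine
    engine.dispose()

@pytest.fixture
def postgresql_psycopg2_conn(postgresql_psycopg2_engine):
    # engine.connect() is the call that raises OperationalError below when no
    # PostgreSQL server is listening on localhost:5432.
    with postgresql_psycopg2_engine.connect() as conn:
        yield conn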
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
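Reduced to its essentials, the DSN assembled above ('host=localhost dbname=pandas user=postgres password=postgres port=5432') can be exercised directly; a minimal sketch that raises the same psycopg2.OperationalError whenever nothing is listening on port 5432:

import psycopg2

# Same connection parameters as the DSN shown in the traceback.
conn = psycopg2.connect(
    host="localhost",
    port=5432,
    dbname="pandas",
    user="postgres",
    password="postgres",
)
conn.close()

The two "Connection refused" lines in the error come from localhost resolving to both ::1 and 127.0.0.1; each address is tried in turn and refused.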
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_engine] _ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s expected_count = 'Success!' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s @pytest.mark.parametrize("expected_count", [2, "Success!"]) 2410s def test_copy_from_callable_insertion_method(conn, expected_count, request): 2410s # GH 8953 2410s # Example in io.rst found under _io.sql.method 2410s # not available in sqlite, mysql 2410s def psql_insert_copy(table, conn, keys, data_iter): 2410s # gets a DBAPI connection that can provide a cursor 2410s dbapi_conn = conn.connection 2410s with dbapi_conn.cursor() as cur: 2410s s_buf = StringIO() 2410s writer = csv.writer(s_buf) 2410s writer.writerows(data_iter) 2410s s_buf.seek(0) 2410s 2410s columns = ", ".join([f'"{k}"' for k in keys]) 2410s if table.schema: 2410s table_name = f"{table.schema}.{table.name}" 2410s else: 2410s table_name = table.name 2410s 2410s sql_query = f"COPY {table_name} ({columns}) FROM STDIN WITH CSV" 2410s cur.copy_expert(sql=sql_query, file=s_buf) 2410s return expected_count 2410s 2410s conn = request.getfixturevalue(conn) 2410s expected = DataFrame({"col1": [1, 2], "col2": [0.1, 0.2], "col3": ["a", "n"]}) 2410s > result_count = expected.to_sql( 2410s name="test_frame", con=conn, index=False, method=psql_insert_copy 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1306: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ( col1 col2 col3 2410s 0 1 0.1 a 2410s 1 2 0.2 n,) 2410s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'method': .psql_insert_copy at 0x79e9eafa9f80>, 'name': 'test_frame'} 2410s 2410s @wraps(func) 2410s def wrapper(*args, **kwargs): 2410s if len(args) > num_allow_args: 2410s warnings.warn( 2410s msg.format(arguments=_format_argument_list(allow_args)), 2410s FutureWarning, 2410s stacklevel=find_stack_level(), 2410s ) 2410s > return func(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = col1 col2 col3 2410s 0 1 0.1 a 2410s 1 2 0.2 n 2410s name = 'test_frame' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'fail', index = False, index_label = None 2410s chunksize = None, dtype = None 2410s method = .psql_insert_copy at 0x79e9eafa9f80> 2410s 2410s @final 2410s @deprecate_nonkeyword_arguments( 2410s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2410s ) 2410s def to_sql( 2410s self, 2410s name: 
str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool_t = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2410s newly created, appended to, or overwritten. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s Name of SQL table. 2410s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. Legacy support is provided for sqlite3.Connection objects. The user 2410s is responsible for engine disposal and connection closure for the SQLAlchemy 2410s connectable. See `here \ 2410s `_. 2410s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2410s the transaction will not be committed. If passing a sqlite3.Connection, 2410s it will not be possible to roll back the record insertion. 2410s 2410s schema : str, optional 2410s Specify the schema (if database flavor supports this). If None, use 2410s default schema. 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s How to behave if the table already exists. 2410s 2410s * fail: Raise a ValueError. 2410s * replace: Drop the table before inserting new values. 2410s * append: Insert new values to the existing table. 2410s 2410s index : bool, default True 2410s Write DataFrame index as a column. Uses `index_label` as the column 2410s name in the table. Creates a table index for this column. 2410s index_label : str or sequence, default None 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s * None : Uses standard SQL ``INSERT`` clause (one per row). 2410s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2410s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s The number of returned rows affected is the sum of the ``rowcount`` 2410s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2410s reflect the exact number of written rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Raises 2410s ------ 2410s ValueError 2410s When the table already exists and `if_exists` is 'fail' (the 2410s default). 
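The method parameter documented above is exactly what the failing test exercises with its COPY-based callable. A self-contained sketch of that usage, assuming a reachable PostgreSQL server at the same DSN the tests use:

import csv
from io import StringIO
import pandas as pd
from sqlalchemy import create_engine

def psql_insert_copy(table, conn, keys, data_iter):
    # COPY-based insertion callable, as in the test above.
    dbapi_conn = conn.connection
    with dbapi_conn.cursor() as cur:
        s_buf = StringIO()
        csv.writer(s_buf).writerows(data_iter)
        s_buf.seek(0)
        columns = ", ".join(f'"{k}"' for k in keys)
        table_name = f"{table.schema}.{table.name}" if table.schema else table.name
        cur.copy_expert(sql=f"COPY {table_name} ({columns}) FROM STDIN WITH CSV", file=s_buf)

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
df = pd.DataFrame({"col1": [1, 2], "col2": [0.1, 0.2], "col3": ["a", "n"]})
df.to_sql(name="test_frame", con=engine, index=False, method=psql_insert_copy)

Run against the testbed in this log it would fail the same way, since the connection is refused before any COPY is attempted.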
2410s 2410s See Also 2410s -------- 2410s read_sql : Read a DataFrame from a table. 2410s 2410s Notes 2410s ----- 2410s Timezone aware datetime columns will be written as 2410s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2410s database. Otherwise, the datetimes will be stored as timezone unaware 2410s timestamps local to the original timezone. 2410s 2410s Not all datastores support ``method="multi"``. Oracle, for example, 2410s does not support multi-value insert. 2410s 2410s References 2410s ---------- 2410s .. [1] https://docs.sqlalchemy.org 2410s .. [2] https://www.python.org/dev/peps/pep-0249/ 2410s 2410s Examples 2410s -------- 2410s Create an in-memory SQLite database. 2410s 2410s >>> from sqlalchemy import create_engine 2410s >>> engine = create_engine('sqlite://', echo=False) 2410s 2410s Create a table from scratch with 3 rows. 2410s 2410s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2410s >>> df 2410s name 2410s 0 User 1 2410s 1 User 2 2410s 2 User 3 2410s 2410s >>> df.to_sql(name='users', con=engine) 2410s 3 2410s >>> from sqlalchemy import text 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2410s 2410s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2410s 2410s >>> with engine.begin() as connection: 2410s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2410s ... df1.to_sql(name='users', con=connection, if_exists='append') 2410s 2 2410s 2410s This is allowed to support operations that require that the same 2410s DBAPI connection is used for the entire operation. 2410s 2410s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2410s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2410s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2410s (1, 'User 7')] 2410s 2410s Overwrite the table with just ``df2``. 2410s 2410s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2410s ... index_label='id') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 6'), (1, 'User 7')] 2410s 2410s Use ``method`` to define a callable insertion method to do nothing 2410s if there's a primary key conflict on a table in a PostgreSQL database. 2410s 2410s >>> from sqlalchemy.dialects.postgresql import insert 2410s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2410s ... # "a" is the primary key in "conflict_table" 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2410s 0 2410s 2410s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2410s on a primary key. 2410s 2410s >>> from sqlalchemy.dialects.mysql import insert 2410s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2410s ... # update columns "b" and "c" on primary key conflict 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = ( 2410s ... insert(table.table) 2410s ... .values(data) 2410s ... ) 2410s ... 
stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2410s 2 2410s 2410s Specify the dtype (especially useful for integers with missing values). 2410s Notice that while pandas is forced to store the data as floating point, 2410s the database supports nullable integers. When fetching the data with 2410s Python, we get back integer scalars. 2410s 2410s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2410s >>> df 2410s A 2410s 0 1.0 2410s 1 NaN 2410s 2 2.0 2410s 2410s >>> from sqlalchemy.types import Integer 2410s >>> df.to_sql(name='integers', con=engine, index=False, 2410s ... dtype={"A": Integer()}) 2410s 3 2410s 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2410s [(1,), (None,), (2,)] 2410s """ # noqa: E501 2410s from pandas.io import sql 2410s 2410s > return sql.to_sql( 2410s self, 2410s name, 2410s con, 2410s schema=schema, 2410s if_exists=if_exists, 2410s index=index, 2410s index_label=index_label, 2410s chunksize=chunksize, 2410s dtype=dtype, 2410s method=method, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s frame = col1 col2 col3 2410s 0 1 0.1 a 2410s 1 2 0.2 n 2410s name = 'test_frame' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, if_exists = 'fail', index = False, index_label = None 2410s chunksize = None, dtype = None 2410s method = .psql_insert_copy at 0x79e9eafa9f80> 2410s engine = 'auto', engine_kwargs = {} 2410s 2410s def to_sql( 2410s frame, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s engine: str = "auto", 2410s **engine_kwargs, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Parameters 2410s ---------- 2410s frame : DataFrame, Series 2410s name : str 2410s Name of SQL table. 2410s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2410s or sqlite3 DBAPI2 connection 2410s ADBC provides high performance I/O with native type support, where available. 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. 2410s If a DBAPI2 object, only sqlite3 is supported. 2410s schema : str, optional 2410s Name of SQL schema in database to write to (if database flavor 2410s supports this). If None, use default schema (default). 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s - fail: If table exists, do nothing. 2410s - replace: If table exists, drop it, recreate it, and insert data. 2410s - append: If table exists, insert data. Create if does not exist. 2410s index : bool, default True 2410s Write DataFrame index as a column. 2410s index_label : str or sequence, optional 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 
2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s - None : Uses standard SQL ``INSERT`` clause (one per row). 2410s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2410s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s engine : {'auto', 'sqlalchemy'}, default 'auto' 2410s SQL engine library to use. If 'auto', then the option 2410s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2410s behavior is 'sqlalchemy' 2410s 2410s .. versionadded:: 1.3.0 2410s 2410s **engine_kwargs 2410s Any additional kwargs are passed to the engine. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Notes 2410s ----- 2410s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2410s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2410s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2410s rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__ 2410s """ # noqa: E501 2410s if if_exists not in ("fail", "replace", "append"): 2410s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2410s 2410s if isinstance(frame, Series): 2410s frame = frame.to_frame() 2410s elif not isinstance(frame, DataFrame): 2410s raise NotImplementedError( 2410s "'frame' argument should be either a Series or a DataFrame" 2410s ) 2410s 2410s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 
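pandasSQL_builder, shown here, is the dispatch point: a sqlite3.Connection (or None) is handled by SQLiteDatabase, while a string URL or SQLAlchemy connectable goes to SQLDatabase, which opens its own connection from an Engine. A small sketch of both entry points, using SQLite only so it runs without any server (table and frame contents are arbitrary):

import sqlite3
import pandas as pd
from sqlalchemy import create_engine

df = pd.DataFrame({"a": [1, 2]})

# DBAPI path: sqlite3.Connection is handled by the SQLiteDatabase fallback.
with sqlite3.connect(":memory:") as con:
    df.to_sql("t", con, index=False)

# SQLAlchemy path: an Engine goes to SQLDatabase; pandas calls engine.connect()
# itself, which is exactly the call that fails against PostgreSQL in this log.
engine = create_engine("sqlite://")
df.to_sql("t", engine, index=False)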
2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _ test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_conn] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
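Editor's note: Engine.raw_connection() and Pool.connect(), whose docstrings appear above, hand out a proxied DBAPI connection whose close() returns it to the pool rather than closing it. A brief usage sketch, assuming the engine object from the earlier sketch:

raw = engine.raw_connection()      # proxied psycopg2 connection from the pool
try:
    with raw.cursor() as cur:      # psycopg2 cursors are context managers
        cur.execute("SELECT 1")
        print(cur.fetchone())
    raw.commit()
finally:
    raw.close()                    # returns the connection to the pool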
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn', expected_count = 'Success!' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s @pytest.mark.parametrize("expected_count", [2, "Success!"]) 2410s def test_copy_from_callable_insertion_method(conn, expected_count, request): 2410s # GH 8953 2410s # Example in io.rst found under _io.sql.method 2410s # not available in sqlite, mysql 2410s def psql_insert_copy(table, conn, keys, data_iter): 2410s # gets a DBAPI connection that can provide a cursor 2410s dbapi_conn = conn.connection 2410s with dbapi_conn.cursor() as cur: 2410s s_buf = StringIO() 2410s writer = csv.writer(s_buf) 2410s writer.writerows(data_iter) 2410s s_buf.seek(0) 2410s 2410s columns = ", ".join([f'"{k}"' for k in keys]) 2410s if table.schema: 2410s table_name = f"{table.schema}.{table.name}" 2410s else: 2410s table_name = table.name 2410s 2410s sql_query = f"COPY {table_name} ({columns}) FROM STDIN WITH CSV" 2410s cur.copy_expert(sql=sql_query, file=s_buf) 2410s return expected_count 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1304: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
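Editor's note: the failure above is in the test for the callable insertion method of DataFrame.to_sql; the psql_insert_copy helper shown in the traceback streams rows through PostgreSQL COPY instead of per-row INSERTs. A hedged sketch of how that helper is used from pandas, assuming a reachable engine; the helper body mirrors the test code:

import csv
from io import StringIO
import pandas as pd

def psql_insert_copy(table, conn, keys, data_iter):
    dbapi_conn = conn.connection              # DBAPI connection behind the SQLAlchemy one
    with dbapi_conn.cursor() as cur:
        s_buf = StringIO()
        csv.writer(s_buf).writerows(data_iter)
        s_buf.seek(0)
        columns = ", ".join(f'"{k}"' for k in keys)
        table_name = f"{table.schema}.{table.name}" if table.schema else table.name
        cur.copy_expert(f"COPY {table_name} ({columns}) FROM STDIN WITH CSV", s_buf)

df = pd.DataFrame({"col1": [1, 2], "col2": [0.1, 0.2]})
df.to_sql("test_frame", engine, index=False, method=psql_insert_copy)  # engine assumed reachable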
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
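Editor's note: request.getfixturevalue(), whose internals are being stepped through here, resolves a fixture by name at run time; the pandas test needs it because the fixture name arrives as a parametrized string. A small illustration with invented names (db_engine and conn_name are placeholders, not fixtures from the pandas suite):

import pytest

@pytest.fixture
def db_engine():
    return object()          # stand-in; a real fixture would build an Engine here

@pytest.mark.parametrize("conn_name", ["db_engine"])
def test_dynamic_fixture(conn_name, request):
    # mirrors conn = request.getfixturevalue(conn) in pandas/tests/io/test_sql.py
    resource = request.getfixturevalue(conn_name)
    assert resource is not None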
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
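Editor's note: these frames pass through pluggy, the hook framework pytest is built on; HookCaller.__call__ validates the keyword arguments and _hookexec runs the registered implementations, here pytest_fixture_setup with firstresult semantics. A minimal, self-contained pluggy sketch of the same call pattern; every name below is invented for illustration and has nothing to do with pandas:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)     # like pytest_fixture_setup: first non-None result wins
    def compute(self, value):
        """Return a result for value."""

class DoublePlugin:
    @hookimpl
    def compute(self, value):
        return value * 2

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(DoublePlugin())
print(pm.hook.compute(value=21))    # hooks are called with keywords only; prints 42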
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
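Editor's note: _handle_dbapi_exception_noconnection, shown above, wraps the driver's psycopg2.OperationalError in sqlalchemy.exc.OperationalError (a DBAPIError subclass) and chains the original exception. A sketch of how calling code can detect this failure mode explicitly rather than letting it propagate; the engine object is again assumed:

from sqlalchemy import text
from sqlalchemy.exc import OperationalError

try:
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
except OperationalError as exc:
    # exc.orig is the underlying psycopg2.OperationalError
    print("database unreachable:", exc.orig)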
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
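Editor's note: the same checkout chain repeats for each failing test because every fixture asks the engine's QueuePool for a connection. Test suites sometimes disable pooling so each checkout maps to one short-lived DBAPI connection; a sketch using SQLAlchemy's NullPool (URL and credentials are placeholders):

from sqlalchemy import create_engine
from sqlalchemy.pool import NullPool

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    poolclass=NullPool,    # no pooling: open and close a real connection per checkout
)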
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___ test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_engine] ___ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
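Editor's note: every failure in this run comes down to the error reported above; nothing is listening on localhost:5432, so psycopg2 gets Connection refused on both the IPv6 and IPv4 addresses. A small pre-flight check of the kind a harness could use to skip the PostgreSQL-backed tests instead of failing them; host and port mirror the DSN in the log:

import socket

def postgres_reachable(host="localhost", port=5432, timeout=1.0):
    """Return True if a TCP connection to the server can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not postgres_reachable():
    print("PostgreSQL is not reachable; DB-backed tests would be skipped here")

On the command line, pg_isready -h localhost -p 5432 performs an equivalent check.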
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s def test_insertion_method_on_conflict_do_nothing(conn, request): 2410s # GH 15988: Example in to_sql docstring 2410s conn = request.getfixturevalue(conn) 2410s 2410s from sqlalchemy.dialects.postgresql import insert 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.sql import text 2410s 2410s def insert_on_conflict(table, conn, keys, data_iter): 2410s data = [dict(zip(keys, row)) for row in data_iter] 2410s stmt = ( 2410s insert(table.table) 2410s .values(data) 2410s .on_conflict_do_nothing(index_elements=["a"]) 2410s ) 2410s result = conn.execute(stmt) 2410s return result.rowcount 2410s 2410s create_sql = text( 2410s """ 2410s CREATE TABLE test_insert_conflict ( 2410s a integer PRIMARY KEY, 2410s b numeric, 2410s c text 2410s ); 2410s """ 2410s ) 2410s if isinstance(conn, Engine): 2410s > with conn.connect() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1347: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
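Editor's note: the test failing here (test_insertion_method_on_conflict_do_nothing, GH 15988, body quoted above) demonstrates an insertion method built on PostgreSQL's ON CONFLICT DO NOTHING. A hedged sketch of using that helper with DataFrame.to_sql, assuming a reachable engine and a test_insert_conflict table created with primary key column "a" as in the test:

import pandas as pd
from sqlalchemy.dialects.postgresql import insert

def insert_on_conflict(table, conn, keys, data_iter):
    data = [dict(zip(keys, row)) for row in data_iter]
    stmt = (
        insert(table.table)
        .values(data)
        .on_conflict_do_nothing(index_elements=["a"])
    )
    return conn.execute(stmt).rowcount     # rows actually inserted

df = pd.DataFrame({"a": [1, 2], "b": [2.1, 3.2], "c": ["x", "y"]})
# rows whose key "a" already exists are silently skipped
df.to_sql("test_insert_conflict", engine, if_exists="append",
          index=False, method=insert_on_conflict)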
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_conn] ____ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s def test_insertion_method_on_conflict_do_nothing(conn, request): 2410s # GH 15988: Example in to_sql docstring 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1321: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 
2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___________ test_to_sql_on_public_schema[postgresql_psycopg2_engine] ___________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_to_sql_on_public_schema(conn, request): 2410s if "sqlite" in conn or "mysql" in conn: 2410s request.applymarker( 2410s pytest.mark.xfail( 2410s reason="test for public schema only specific to postgresql" 2410s ) 2410s ) 2410s 2410s conn = request.getfixturevalue(conn) 2410s 2410s test_data = DataFrame([[1, 2.1, "a"], [2, 3.1, "b"]], columns=list("abc")) 2410s > test_data.to_sql( 2410s name="test_public_schema", 2410s con=conn, 2410s if_exists="append", 2410s index=False, 2410s schema="public", 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1388: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ( a b c 2410s 0 1 2.1 a 2410s 1 2 3.1 b,) 2410s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'append', 'index': False, 'name': 'test_public_schema', ...} 2410s 2410s @wraps(func) 2410s def wrapper(*args, **kwargs): 2410s if len(args) > num_allow_args: 2410s warnings.warn( 2410s msg.format(arguments=_format_argument_list(allow_args)), 2410s FutureWarning, 2410s stacklevel=find_stack_level(), 2410s ) 2410s > return func(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = a b c 2410s 0 1 2.1 a 2410s 1 2 3.1 b, name = 'test_public_schema' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = 'public', if_exists = 'append', index = False, index_label = None 2410s chunksize = None, dtype = None, method = None 2410s 2410s @final 2410s @deprecate_nonkeyword_arguments( 2410s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2410s ) 2410s def to_sql( 2410s self, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool_t = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2410s newly created, appended to, or overwritten. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s Name of SQL table. 
2410s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. Legacy support is provided for sqlite3.Connection objects. The user 2410s is responsible for engine disposal and connection closure for the SQLAlchemy 2410s connectable. See `here \ 2410s `_. 2410s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2410s the transaction will not be committed. If passing a sqlite3.Connection, 2410s it will not be possible to roll back the record insertion. 2410s 2410s schema : str, optional 2410s Specify the schema (if database flavor supports this). If None, use 2410s default schema. 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s How to behave if the table already exists. 2410s 2410s * fail: Raise a ValueError. 2410s * replace: Drop the table before inserting new values. 2410s * append: Insert new values to the existing table. 2410s 2410s index : bool, default True 2410s Write DataFrame index as a column. Uses `index_label` as the column 2410s name in the table. Creates a table index for this column. 2410s index_label : str or sequence, default None 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s * None : Uses standard SQL ``INSERT`` clause (one per row). 2410s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2410s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s The number of returned rows affected is the sum of the ``rowcount`` 2410s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2410s reflect the exact number of written rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Raises 2410s ------ 2410s ValueError 2410s When the table already exists and `if_exists` is 'fail' (the 2410s default). 2410s 2410s See Also 2410s -------- 2410s read_sql : Read a DataFrame from a table. 2410s 2410s Notes 2410s ----- 2410s Timezone aware datetime columns will be written as 2410s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2410s database. Otherwise, the datetimes will be stored as timezone unaware 2410s timestamps local to the original timezone. 2410s 2410s Not all datastores support ``method="multi"``. Oracle, for example, 2410s does not support multi-value insert. 2410s 2410s References 2410s ---------- 2410s .. [1] https://docs.sqlalchemy.org 2410s .. 
[2] https://www.python.org/dev/peps/pep-0249/ 2410s 2410s Examples 2410s -------- 2410s Create an in-memory SQLite database. 2410s 2410s >>> from sqlalchemy import create_engine 2410s >>> engine = create_engine('sqlite://', echo=False) 2410s 2410s Create a table from scratch with 3 rows. 2410s 2410s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2410s >>> df 2410s name 2410s 0 User 1 2410s 1 User 2 2410s 2 User 3 2410s 2410s >>> df.to_sql(name='users', con=engine) 2410s 3 2410s >>> from sqlalchemy import text 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2410s 2410s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2410s 2410s >>> with engine.begin() as connection: 2410s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2410s ... df1.to_sql(name='users', con=connection, if_exists='append') 2410s 2 2410s 2410s This is allowed to support operations that require that the same 2410s DBAPI connection is used for the entire operation. 2410s 2410s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2410s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2410s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2410s (1, 'User 7')] 2410s 2410s Overwrite the table with just ``df2``. 2410s 2410s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2410s ... index_label='id') 2410s 2 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM users")).fetchall() 2410s [(0, 'User 6'), (1, 'User 7')] 2410s 2410s Use ``method`` to define a callable insertion method to do nothing 2410s if there's a primary key conflict on a table in a PostgreSQL database. 2410s 2410s >>> from sqlalchemy.dialects.postgresql import insert 2410s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2410s ... # "a" is the primary key in "conflict_table" 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2410s 0 2410s 2410s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2410s on a primary key. 2410s 2410s >>> from sqlalchemy.dialects.mysql import insert 2410s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2410s ... # update columns "b" and "c" on primary key conflict 2410s ... data = [dict(zip(keys, row)) for row in data_iter] 2410s ... stmt = ( 2410s ... insert(table.table) 2410s ... .values(data) 2410s ... ) 2410s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2410s ... result = conn.execute(stmt) 2410s ... return result.rowcount 2410s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2410s 2 2410s 2410s Specify the dtype (especially useful for integers with missing values). 2410s Notice that while pandas is forced to store the data as floating point, 2410s the database supports nullable integers. When fetching the data with 2410s Python, we get back integer scalars. 
2410s 2410s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2410s >>> df 2410s A 2410s 0 1.0 2410s 1 NaN 2410s 2 2.0 2410s 2410s >>> from sqlalchemy.types import Integer 2410s >>> df.to_sql(name='integers', con=engine, index=False, 2410s ... dtype={"A": Integer()}) 2410s 3 2410s 2410s >>> with engine.connect() as conn: 2410s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2410s [(1,), (None,), (2,)] 2410s """ # noqa: E501 2410s from pandas.io import sql 2410s 2410s > return sql.to_sql( 2410s self, 2410s name, 2410s con, 2410s schema=schema, 2410s if_exists=if_exists, 2410s index=index, 2410s index_label=index_label, 2410s chunksize=chunksize, 2410s dtype=dtype, 2410s method=method, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s frame = a b c 2410s 0 1 2.1 a 2410s 1 2 3.1 b, name = 'test_public_schema' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = 'public', if_exists = 'append', index = False, index_label = None 2410s chunksize = None, dtype = None, method = None, engine = 'auto' 2410s engine_kwargs = {} 2410s 2410s def to_sql( 2410s frame, 2410s name: str, 2410s con, 2410s schema: str | None = None, 2410s if_exists: Literal["fail", "replace", "append"] = "fail", 2410s index: bool = True, 2410s index_label: IndexLabel | None = None, 2410s chunksize: int | None = None, 2410s dtype: DtypeArg | None = None, 2410s method: Literal["multi"] | Callable | None = None, 2410s engine: str = "auto", 2410s **engine_kwargs, 2410s ) -> int | None: 2410s """ 2410s Write records stored in a DataFrame to a SQL database. 2410s 2410s Parameters 2410s ---------- 2410s frame : DataFrame, Series 2410s name : str 2410s Name of SQL table. 2410s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2410s or sqlite3 DBAPI2 connection 2410s ADBC provides high performance I/O with native type support, where available. 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. 2410s If a DBAPI2 object, only sqlite3 is supported. 2410s schema : str, optional 2410s Name of SQL schema in database to write to (if database flavor 2410s supports this). If None, use default schema (default). 2410s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2410s - fail: If table exists, do nothing. 2410s - replace: If table exists, drop it, recreate it, and insert data. 2410s - append: If table exists, insert data. Create if does not exist. 2410s index : bool, default True 2410s Write DataFrame index as a column. 2410s index_label : str or sequence, optional 2410s Column label for index column(s). If None is given (default) and 2410s `index` is True, then the index names are used. 2410s A sequence should be given if the DataFrame uses MultiIndex. 2410s chunksize : int, optional 2410s Specify the number of rows in each batch to be written at a time. 2410s By default, all rows will be written at once. 2410s dtype : dict or scalar, optional 2410s Specifying the datatype for columns. If a dictionary is used, the 2410s keys should be the column names and the values should be the 2410s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2410s scalar is provided, it will be applied to all columns. 2410s method : {None, 'multi', callable}, optional 2410s Controls the SQL insertion clause used: 2410s 2410s - None : Uses standard SQL ``INSERT`` clause (one per row). 
2410s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2410s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2410s 2410s Details and a sample callable implementation can be found in the 2410s section :ref:`insert method `. 2410s engine : {'auto', 'sqlalchemy'}, default 'auto' 2410s SQL engine library to use. If 'auto', then the option 2410s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2410s behavior is 'sqlalchemy' 2410s 2410s .. versionadded:: 1.3.0 2410s 2410s **engine_kwargs 2410s Any additional kwargs are passed to the engine. 2410s 2410s Returns 2410s ------- 2410s None or int 2410s Number of rows affected by to_sql. None is returned if the callable 2410s passed into ``method`` does not return an integer number of rows. 2410s 2410s .. versionadded:: 1.4.0 2410s 2410s Notes 2410s ----- 2410s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2410s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2410s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2410s rows as stipulated in the 2410s `sqlite3 `__ or 2410s `SQLAlchemy `__ 2410s """ # noqa: E501 2410s if if_exists not in ("fail", "replace", "append"): 2410s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2410s 2410s if isinstance(frame, Series): 2410s frame = frame.to_frame() 2410s elif not isinstance(frame, DataFrame): 2410s raise NotImplementedError( 2410s "'frame' argument should be either a Series or a DataFrame" 2410s ) 2410s 2410s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = 'public', need_transaction = True 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = 'public', need_transaction = True 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 
2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) 
and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. 
seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 
2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 
2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____________ test_to_sql_on_public_schema[postgresql_psycopg2_conn] ____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 
2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise 
exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_to_sql_on_public_schema(conn, request): 2410s if "sqlite" in conn or "mysql" in conn: 2410s request.applymarker( 2410s pytest.mark.xfail( 2410s reason="test for public schema only specific to postgresql" 2410s ) 2410s ) 2410s 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1385: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 
2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ________ test_insertion_method_on_conflict_update[mysql_pymysql_engine] ________ 2410s conn = 'mysql_pymysql_engine' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", mysql_connectable) 2410s def test_insertion_method_on_conflict_update(conn, request): 2410s # GH 14553: Example in to_sql docstring 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1403: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 
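The test being set up here, test_insertion_method_on_conflict_update, exercises the callable method= argument of DataFrame.to_sql (GH 14553, the example from the to_sql docstring). For reference, a sketch of such a callable for MySQL is below; the column names are made up, and the (table, conn, keys, data_iter) signature is the one documented for to_sql's method parameter:

    from sqlalchemy.dialects.mysql import insert

    def insert_on_conflict_update(table, conn, keys, data_iter):
        # Signature required by DataFrame.to_sql(method=...).
        data = [dict(zip(keys, row)) for row in data_iter]
        stmt = insert(table.table).values(data)
        # Hypothetical columns "b" and "c": on a duplicate key, overwrite
        # them with the incoming values.
        stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c)
        result = conn.execute(stmt)
        return result.rowcount

    # df.to_sql("test_insert_conflict", engine, if_exists="append",
    #           index=False, method=insert_on_conflict_update)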
2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
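The long chain above (getfixturevalue -> _get_active_fixturedef -> FixtureDef.execute -> pytest_fixture_setup -> call_fixture_func) is pytest resolving a fixture whose name was passed through @pytest.mark.parametrize as a string, the pattern the pandas SQL tests rely on. A self-contained sketch of that pattern, with a made-up in-memory SQLite fixture standing in for the real MySQL/PostgreSQL engine fixtures:

    import pytest

    @pytest.fixture
    def sqlite_engine():
        # Stand-in for mysql_pymysql_engine / postgresql_psycopg2_engine.
        sqlalchemy = pytest.importorskip("sqlalchemy")
        engine = sqlalchemy.create_engine("sqlite://")
        yield engine
        engine.dispose()

    @pytest.mark.parametrize("conn", ["sqlite_engine"])
    def test_dynamic_fixture_lookup(conn, request):
        # The parametrized value is a fixture *name*; it is resolved at run
        # time, which is why a broken fixture surfaces as an error inside
        # request.getfixturevalue() rather than at collection.
        conn = request.getfixturevalue(conn)
        assert conn is not None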
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eab70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eab70> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _________ test_insertion_method_on_conflict_update[mysql_pymysql_conn] _________ 2410s conn = 'mysql_pymysql_conn' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", mysql_connectable) 2410s def test_insertion_method_on_conflict_update(conn, request): 2410s # GH 14553: Example in to_sql docstring 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1403: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
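The AttributeError above is the real root cause of every pymysql-based failure in this run: importing pymysql pulls in pymysql._auth, which imports cryptography, and cryptography's pure-Python hashes.py cannot find the hashes submodule in its compiled _rust bindings. pymysql does guard that import, but the guard appears to catch ImportError only, so the AttributeError escapes and the whole pymysql import fails. A missing openssl.hashes attribute usually means the Python layer and the compiled _rust extension come from different cryptography builds (for example a partially upgraded package, or an interpreter picking up bindings built for another ABI). A small diagnostic sketch, mirroring the exact import performed by hashes.py:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # On a consistent install the compiled bindings expose openssl.hashes;
    # here it is missing, which triggers the AttributeError shown above.
    print("cryptography version:", cryptography.__version__)
    print("openssl.hashes available:", hasattr(rust_openssl, "hashes"))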
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eac30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eac30> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _____________ test_read_view_postgres[postgresql_psycopg2_engine] ______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s def test_read_view_postgres(conn, request): 2410s # GH 52969 2410s conn = request.getfixturevalue(conn) 2410s 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.sql import text 2410s 2410s table_name = f"group_{uuid.uuid4().hex}" 2410s view_name = f"group_view_{uuid.uuid4().hex}" 2410s 2410s sql_stmt = text( 2410s f""" 2410s CREATE TABLE {table_name} ( 2410s group_id INTEGER, 2410s name TEXT 2410s ); 2410s INSERT INTO {table_name} VALUES 2410s (1, 'name'); 2410s CREATE VIEW {view_name} 2410s AS 2410s SELECT * FROM {table_name}; 2410s """ 2410s ) 2410s if isinstance(conn, Engine): 2410s > with conn.connect() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1478: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 
2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = 
_raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______________ test_read_view_postgres[postgresql_psycopg2_conn] _______________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", postgresql_connectable) 2410s def test_read_view_postgres(conn, request): 2410s # GH 52969 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1456: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 
2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ___________ test_read_sql_iris_parameter[mysql_pymysql_engine_iris] ____________ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1555: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
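The pytest internals quoted above (getfixturevalue, _get_active_fixturedef, FixtureDef.execute) all belong to one dynamic fixture lookup: the parametrized test receives the fixture name as a string ('mysql_pymysql_engine_iris') and resolves it at run time. A minimal, self-contained sketch of that pattern, with hypothetical fixture names unrelated to pandas:

import pytest

@pytest.fixture
def sample_engine():
    # Stand-in for a database fixture; a real fixture would open a connection here.
    return "engine"

@pytest.mark.parametrize("conn", ["sample_engine"])
def test_dynamic_fixture_lookup(conn, request):
    # Resolve the fixture by name at test time, as pandas' test_sql.py does.
    value = request.getfixturevalue(conn)
    assert value == "engine"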
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
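pandas' import_optional_dependency, whose docstring is quoted above, is what versioned_importorskip wraps: it imports an optional dependency and, depending on errors=, raises, warns, or returns None when the package is missing or too old. A minimal usage sketch of that helper (pandas.compat._optional is a private module; it is used here only to mirror the traceback):

from pandas.compat._optional import import_optional_dependency

# With errors="ignore", a missing optional dependency yields None instead of
# an ImportError; a present module is returned even if it is older than the
# minimum version pandas documents for it.
mod = import_optional_dependency("pymysql", errors="ignore")
print("pymysql importable:", mod is not None)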
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eb170>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eb170> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ____________ test_read_sql_iris_parameter[mysql_pymysql_conn_iris] _____________ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1555: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 
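The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is what actually breaks "import pymysql" on this image: the pure-Python layer of python3-cryptography expects a hashes submodule in its compiled Rust bindings that the installed extension does not provide, which suggests the Python and compiled parts of the package are out of sync on the testbed. A minimal sketch for checking that condition directly, using only names that appear in the traceback:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography", cryptography.__version__)
if hasattr(rust_openssl, "hashes"):
    print("Rust bindings expose 'hashes'; pymysql's _auth module should import")
else:
    print("Rust bindings lack 'hashes'; the Python layer and the compiled "
          "extension disagree, so 'import pymysql' fails exactly as in this log")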
2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eb290>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4eb290> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ________ test_read_sql_iris_parameter[postgresql_psycopg2_engine_iris] _________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
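The AttributeError shown above (cryptography/hazmat/primitives/hashes.py:87) is the root cause of the mysql_pymysql_engine failures: pymysql imports cryptography, and the installed python3-cryptography Python layer does not match its compiled _rust bindings, so the module blows up while initialising instead of raising ImportError. pandas' versioned_importorskip (like pytest.importorskip) only turns ImportError into a skip, which is why this surfaces as a test error rather than a skipped test. A minimal, hypothetical sketch of that optional-import-and-skip pattern (not the pandas implementation; the version check here is deliberately naive):

    import importlib
    import pytest

    def optional_importorskip(name, min_version=None):
        # Import the optional dependency or skip the calling test.
        try:
            module = importlib.import_module(name)
        except ImportError as exc:
            # Only ImportError is converted into a skip; an AttributeError
            # raised while the module initialises (as in the log above)
            # propagates and fails the test instead.
            pytest.skip(f"could not import {name!r}: {exc}")
        if min_version is not None:
            version = getattr(module, "__version__", "0")
            if tuple(version.split(".")) < tuple(min_version.split(".")):
                # Naive string comparison; a real check would use packaging.version.
                pytest.skip(f"{name} {version} is older than {min_version}")
        return module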
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1555: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
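As the getfixturevalue docstring quoted above explains, the pandas SQL tests parametrize over fixture *names* and resolve them at setup time, which is why a broken engine fixture turns every parametrized variant of test_read_sql_iris_parameter into an error. A self-contained sketch of the same pattern, using a made-up fixture name:

    import pytest

    @pytest.fixture
    def sqlite_url():
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_url"])
    def test_dynamic_fixture_lookup(conn, request):
        # The parametrized value is a fixture *name*; resolve it dynamically.
        url = request.getfixturevalue(conn)
        assert url.startswith("sqlite")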
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...equest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...equest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s _________ test_read_sql_iris_parameter[postgresql_psycopg2_conn_iris] __________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1555: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...bRequest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...bRequest 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ________ test_read_sql_iris_named_parameter[mysql_pymysql_engine_iris] _________ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_named_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1575: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
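The failure above is environmental rather than a pandas bug: nothing is listening on localhost:5432, so every fixture that loads the iris table into PostgreSQL errors out during setup and the dependent tests fail with the wrapped OperationalError. A minimal standalone sketch of the same connection attempt, with the engine URL reconstructed from the Engine repr and connection parameters visible in the traceback (the exact credentials are an assumption read off the log, not taken from pandas' test configuration):

import sqlalchemy
from sqlalchemy.exc import OperationalError

# URL reconstructed from Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)
# and the cparams dict shown above (password 'postgres'); purely illustrative.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect() as conn:
        conn.execute(sqlalchemy.text("SELECT 1"))
except OperationalError as exc:
    # With no PostgreSQL server on that port this raises the same
    # "Connection refused" psycopg2.OperationalError wrapped by SQLAlchemy.
    print(exc)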
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
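The mysql_pymysql_engine fixture quoted above relies on pandas' versioned_importorskip / import_optional_dependency helpers, which implement an import-or-skip pattern: import the optional dependency by name and skip the test when it is missing or too old. A rough generic sketch of that pattern (names and behaviour here are illustrative, not pandas' actual implementation):

import importlib
import pytest

def import_or_skip(name: str):
    # Import the optional module; skip the calling test if it is not installed.
    try:
        return importlib.import_module(name)
    except ImportError as exc:
        pytest.skip(f"optional dependency {name!r} is not available: {exc}")

In the run above, however, importing pymysql does not raise ImportError: it fails further down with an AttributeError inside cryptography, so the skip path is never taken and the MySQL-backed tests fail during fixture setup instead.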
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4ebbf0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4ebbf0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _________ test_read_sql_iris_named_parameter[mysql_pymysql_conn_iris] __________ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_named_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1575: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 
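Context for the AttributeError above: pymysql treats cryptography as an optional dependency and only guards against it being absent, not against it failing while it initialises. A minimal sketch of that guard pattern, simplified from the pymysql/_auth.py source quoted in the traceback (the flag name is illustrative, not verbatim):

    try:
        # cryptography is optional; pymysql needs it only for certain MySQL auth methods
        from cryptography.hazmat.backends import default_backend
        from cryptography.hazmat.primitives import serialization, hashes
        _have_cryptography = True
    except ImportError:
        # Only a missing package is tolerated. Any other exception raised while
        # cryptography itself imports (here, an AttributeError because the Rust
        # binding does not expose `hashes`) propagates and aborts `import pymysql`.
        _have_cryptography = False

Because the failure is an AttributeError rather than an ImportError, every test that imports pymysql errors out instead of being skipped, as the next traceback shows.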
2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4ebcb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea4ebcb0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _____ test_read_sql_iris_named_parameter[postgresql_psycopg2_engine_iris] ______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
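For reference, the fixture above reaches pymysql through pandas' optional-dependency helper, whose docstring is quoted in the traceback, via the Debian-specific `versioned_importorskip` wrapper. Both convert a missing module into an ImportError or a skip, but neither intercepts other exceptions raised during import, which is why the broken cryptography build surfaces as a test error rather than a skip. An illustrative call, assuming only what the quoted docstring states:

    from pandas.compat._optional import import_optional_dependency

    # errors="raise" (default): missing module, or one below the required
    #                           minimum version, raises ImportError.
    # errors="warn":            a too-old module warns and returns None.
    # errors="ignore":          a missing module returns None; a too-old
    #                           module is returned for the caller to check.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql is not installed")
    # Note: in the environment captured by this log, even errors="ignore"
    # would re-raise the AttributeError above, since only ImportError is handled.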
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_named_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1575: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque... 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque... 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
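The frames above show SQLAlchemy wrapping the underlying psycopg2 error into sqlalchemy.exc.OperationalError via _handle_dbapi_exception_noconnection before re-raising. A minimal sketch of catching that wrapped error around a connection attempt; the URL mirrors the engine repr in this log, and skipping on failure is an assumption for illustration, not what the pandas suite does:

import pytest
import sqlalchemy
from sqlalchemy import create_engine

def connect_or_skip(url="postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"):
    engine = create_engine(url)
    try:
        # Raises sqlalchemy.exc.OperationalError (wrapping psycopg2.OperationalError)
        # when nothing is listening on localhost:5432, as in the failures in this log.
        return engine.connect()
    except sqlalchemy.exc.OperationalError as exc:
        pytest.skip(f"PostgreSQL not reachable: {exc}")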
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______ test_read_sql_iris_named_parameter[postgresql_psycopg2_conn_iris] _______ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
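The failure above bottoms out in psycopg2: nothing is listening on localhost:5432, so every PostgreSQL-backed fixture fails at setup time with Connection refused. A quick connectivity check using the same parameters as the DSN shown in the traceback; this is a standalone sketch, not part of the test suite:

import psycopg2

try:
    conn = psycopg2.connect(
        host="localhost", dbname="pandas",
        user="postgres", password="postgres", port=5432,
    )
    conn.close()
    print("PostgreSQL reachable")
except psycopg2.OperationalError as exc:
    print(f"PostgreSQL not reachable: {exc}")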
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s sql_strings = {'read_named_parameters': {'mysql': '\n SELECT * FROM iris WHERE\n `Name`=%(name)s AND `...LECT * FROM iris WHERE "Name"=%s AND "SepalLength"=%s', 'sqlite': 'SELECT * FROM iris WHERE Name=? AND SepalLength=?'}} 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_read_sql_iris_named_parameter(conn, request, sql_strings): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail( 2410s reason="'params' not implemented for ADBC drivers", 2410s strict=True, 2410s ) 2410s ) 2410s 2410s conn_name = conn 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1575: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 
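The sql_strings fixture above carries one parameterized query per backend; the test only reaches the PostgreSQL variant if the engine fixture can connect. A self-contained sketch of the same read path against SQLite, reusing the sqlite query string visible above with the table reduced to the two referenced columns (an illustration, not the suite's iris schema):

import sqlite3
import pandas as pd

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE iris (Name TEXT, SepalLength REAL)")
con.execute("INSERT INTO iris VALUES ('Iris-setosa', 5.1)")
df = pd.read_sql(
    "SELECT * FROM iris WHERE Name=? AND SepalLength=?",
    con,
    params=("Iris-setosa", 5.1),
)
print(df)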
2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...st 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...st 'postgresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
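create_and_load_iris, shown in the frame above, builds a single insert(iris).values(params) statement from the CSV rows and runs it inside conn.begin(), which commits on success. A minimal sketch of that pattern against an in-memory SQLite engine; the two-column table and the SQLite URL are assumptions for illustration, whereas the real fixture targets the PostgreSQL engine:

from sqlalchemy import Column, Float, MetaData, String, Table, create_engine, insert

engine = create_engine("sqlite://")
metadata = MetaData()
iris = Table(
    "iris", metadata,
    Column("Name", String),
    Column("SepalLength", Float),
)
metadata.create_all(engine)

params = [{"Name": "Iris-setosa", "SepalLength": 5.1}]
stmt = insert(iris).values(params)
with engine.begin() as conn:  # commits on success, rolls back on error
    conn.execute(stmt)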
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
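Editor's note: the psycopg2.connect docstring quoted above lists the basic connection parameters, and the captured dsn shows the values the pandas fixtures use. A minimal sketch of that call, using the same host/port/credentials as in this log (it fails with OperationalError in this run because no PostgreSQL server is listening on the testbed):

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # raised when nothing is listening on localhost:5432, as in this log
        print(f"connection failed: {exc}")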
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ______________ test_api_read_sql_view[mysql_pymysql_engine_iris] _______________ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_view(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
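Editor's note: the getfixturevalue docstring quoted above describes resolving a fixture by name at test run time; that is how these parametrized connection tests turn the string 'mysql_pymysql_engine_iris' into a live fixture. A minimal, self-contained sketch of the pattern (the fixture and test names here are illustrative, not pandas code):

    import pytest

    @pytest.fixture
    def iris_engine():
        # stand-in for a database engine fixture
        return {"name": "iris"}

    @pytest.mark.parametrize("conn", ["iris_engine"])
    def test_dynamic_fixture(conn, request):
        # resolve the fixture named by the parameter at run time
        engine = request.getfixturevalue(conn)
        assert engine["name"] == "iris"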
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
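Editor's note: the execute() body quoted above registers finalizers on the fixtures it depends on so that teardown runs in the right order. A small sketch of the user-facing finalizer mechanism this builds on (illustrative names, not pandas code):

    import pytest

    @pytest.fixture
    def resource(request):
        handle = {"open": True}

        def close():
            handle["open"] = False

        # finalizers registered on the request run during teardown,
        # most recently added first
        request.addfinalizer(close)
        return handle

    def test_uses_resource(resource):
        assert resource["open"]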
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
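Editor's note: the _HookCaller.__call__ docstring quoted above notes that hooks accept keyword arguments only and that a firstresult hook returns a single value rather than a list. A minimal standalone pluggy sketch of that calling convention (the project name "demo" and the hook itself are made up for illustration):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def compute(self, x):
            """Return a value derived from x."""

    class Plugin:
        @hookimpl
        def compute(self, x):
            return x + 1

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())

    # hook calls are keyword-only; firstresult=True yields one value, not a list
    assert pm.hook.compute(x=1) == 2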
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
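Editor's note: versioned_importorskip, quoted above, is a thin Debian-specific wrapper around pandas' internal import_optional_dependency helper, whose docstring follows. A minimal sketch of calling that helper directly (module path as shown in this log; behaviour per the docstring):

    from pandas.compat._optional import import_optional_dependency

    # errors="raise" (the default) raises ImportError if the package is missing
    # or older than pandas' minimum supported version
    pymysql = import_optional_dependency("pymysql")

    # errors="ignore" returns None for a missing package instead of raising
    maybe_pymysql = import_optional_dependency("pymysql", errors="ignore")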
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea498d10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea498d10> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
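Editor's note: the pymysql source quoted above includes the mysqlclient-compatibility shim. A short sketch of how that shim is meant to be used, assuming a working pymysql installation (in this run the pymysql import itself fails earlier, inside cryptography):

    import pymysql

    pymysql.install_as_MySQLdb()

    # from now on, importing MySQLdb transparently gives pymysql
    import MySQLdb  # noqa: E402
    assert MySQLdb is pymysql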
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _______________ test_api_read_sql_view[mysql_pymysql_conn_iris] ________________ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_view(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
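Editor's note: raw_connection(), whose docstring begins above, hands back the pooled DBAPI connection that the earlier psycopg2 failures came from. A minimal sketch using the same URL shape that appears in this log; on this testbed it raises the same OperationalError, because nothing listens on localhost:5432:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        # checking out a connection walks the pool path shown in the traceback above
        # (raw_connection -> QueuePool._do_get -> _ConnectionRecord.__connect -> psycopg2.connect)
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        # on this testbed no server is running, so we end up here
        print(exc)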
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea498e30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea498e30> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________ test_api_read_sql_view[postgresql_psycopg2_engine_iris] ____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
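The AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") is raised while pymysql._auth imports the cryptography package, before any pandas SQL code runs. It may indicate that the pure-Python layer of python3-cryptography and its compiled _rust extension come from different builds, so an attribute the Python code expects is absent from the binding. A hedged diagnostic sketch (not taken from this run):

    import cryptography
    from cryptography.hazmat.bindings import _rust

    print(cryptography.__version__)            # version of the Python layer
    print(hasattr(_rust, "openssl"))           # the compiled bindings themselves import
    print(hasattr(_rust.openssl, "hashes"))    # False here reproduces the failure in hashes.py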
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_view(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
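The psycopg2.OperationalError above, and the sqlalchemy.exc.OperationalError that wraps the same condition later in this log, both mean the pandas SQL fixtures cannot reach PostgreSQL on localhost:5432 with the DSN shown in the traceback; no database server is running on the testbed. A minimal sketch of the equivalent connection attempt (reusing the DSN from the traceback; illustrative only):

    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    # With nothing listening on localhost:5432, requesting a connection raises
    # the same "Connection refused" OperationalError seen in the traceback.
    with engine.connect() as conn:
        conn.execute(sqlalchemy.text("SELECT 1"))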
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque... >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque... >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____________ test_api_read_sql_view[postgresql_psycopg2_conn_iris] _____________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_view(conn, request): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____ test_api_read_sql_with_chunksize_no_result[mysql_pymysql_engine_iris] _____ 2410s conn = 'mysql_pymysql_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_with_chunksize_no_result(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1616: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea499850>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea499850> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _____ test_api_read_sql_with_chunksize_no_result[mysql_pymysql_conn_iris] ______ 2410s conn = 'mysql_pymysql_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_with_chunksize_no_result(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1616: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 
2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea499910>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea499910> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _ test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_engine_iris] __ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_engine_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_with_chunksize_no_result(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1616: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 
2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...esql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...esql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
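Editor's note: the Engine.begin() and Engine.connect() docstrings quoted in the frames above describe the usage pattern the pandas fixtures are attempting. A minimal sketch of that pattern, assuming a reachable PostgreSQL server and the same postgres/postgres credentials shown in the DSN above (not part of the test suite itself):

    # Hypothetical sketch exercising the Engine.begin()/Engine.connect() pattern
    # described in the SQLAlchemy docstrings quoted above.
    from sqlalchemy import create_engine, text

    # Same URL shape as the failing fixture; a server must actually be listening.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

    # begin(): the transaction is committed on success, rolled back on error.
    with engine.begin() as conn:
        conn.execute(text("CREATE TABLE IF NOT EXISTS demo (x INTEGER)"))
        conn.execute(text("INSERT INTO demo (x) VALUES (1)"))

    # connect(): explicit commit; the connection returns to the pool on exit.
    with engine.connect() as conn:
        conn.execute(text("INSERT INTO demo (x) VALUES (2)"))
        conn.commit()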
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
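Editor's note: the psycopg2 docstring above already shows both calling conventions; a small standalone sketch (hypothetical credentials, again assuming something is listening on port 5432):

    import psycopg2

    # Keyword-argument form, mirroring the cparams dict shown in the traceback.
    conn = psycopg2.connect(dbname="pandas", user="postgres",
                            password="postgres", host="localhost", port=5432)

    # Equivalent DSN-string form, matching the dsn value logged above.
    conn2 = psycopg2.connect(
        "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    )

    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())
    conn.close()
    conn2.close()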
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __ test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_conn_iris] ___ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
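Editor's note: the OperationalError reported above (and the sqlalche.me/e/20/e3q8 reference) means nothing is accepting TCP connections on localhost:5432 inside the testbed. A hypothetical pre-flight probe, independent of SQLAlchemy, could confirm that before the SQL-backed tests run:

    import socket

    def postgres_is_up(host: str = "localhost", port: int = 5432, timeout: float = 1.0) -> bool:
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            # Connection refused / timed out: the same condition these tests hit.
            return False

    if not postgres_is_up():
        print("PostgreSQL is not reachable; SQL-backed pandas tests will fail or should be skipped.")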
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn_iris' 2410s request = > 2410s 2410s @pytest.mark.parametrize("conn", all_connectable_iris) 2410s def test_api_read_sql_with_chunksize_no_result(conn, request): 2410s if "adbc" in conn: 2410s request.node.add_marker( 2410s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2410s ) 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1616: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 
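Editor's note: the "direct cause of the following exception" wording above comes from explicit exception chaining: SQLAlchemy wraps the driver error and re-raises it with "raise ... from e", as in the _handle_dbapi_exception_noconnection frame earlier. A toy illustration of the mechanism, with hypothetical exception types:

    class DriverError(Exception):
        """Stand-in for psycopg2.OperationalError."""

    class WrappedError(Exception):
        """Stand-in for sqlalchemy.exc.OperationalError."""

    try:
        try:
            raise DriverError("connection to server at 'localhost', port 5432 failed")
        except DriverError as e:
            # 'from e' records e as __cause__, which pytest prints as
            # "The above exception was the direct cause of the following exception".
            raise WrappedError(str(e)) from e
    except WrappedError as err:
        assert isinstance(err.__cause__, DriverError)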
2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 
2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_engine_iris' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
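Editor's note: the comment block above describes pytest's fixture-overriding rules (a module fixture may shadow a conftest fixture of the same name and may request its own name to receive the value one level up). A minimal sketch of that case, with a hypothetical fixture name and files:

    # conftest.py (hypothetical)
    import pytest

    @pytest.fixture
    def dsn():
        return "postgresql://postgres:postgres@localhost:5432/pandas"

    # test_override.py (hypothetical)
    import pytest

    @pytest.fixture
    def dsn(dsn):
        # Requests its own name: receives the conftest value one level up,
        # exactly the case handled by the negative indexing described above.
        return dsn + "?connect_timeout=1"

    def test_dsn(dsn):
        assert dsn.endswith("connect_timeout=1")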
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. 
(#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'reque...gresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'reque...gresql_psycopg2_engine_iris' for >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
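Editor's note: the pluggy frames above show pytest_fixture_setup being dispatched through the hook machinery, with the setuponly plugin's wrapper (the "return (yield)" implementation) appearing in the next frame. A hedged sketch of how such a hook wrapper could look in a local conftest.py, purely hypothetical and used here only to time fixture setup:

    # conftest.py (hypothetical): wrap fixture setup to report slow fixtures.
    import time
    import pytest

    @pytest.hookimpl(wrapper=True)
    def pytest_fixture_setup(fixturedef, request):
        start = time.monotonic()
        try:
            return (yield)          # run the remaining pytest_fixture_setup impls
        finally:
            elapsed = time.monotonic() - start
            if elapsed > 1.0:
                print(f"fixture {fixturedef.argname!r} took {elapsed:.1f}s")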
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2410s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2410s 2410s def create_and_load_iris(conn, iris_file: Path): 2410s from sqlalchemy import insert 2410s 2410s iris = iris_table_metadata() 2410s 2410s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2410s reader = csv.reader(csvfile) 2410s header = next(reader) 2410s params = [dict(zip(header, row)) for row in reader] 2410s stmt = insert(iris).values(params) 2410s > with conn.begin() as con: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __enter__(self): 2410s # do 
not keep args and kwds alive unnecessarily 2410s # they are only needed for recreation, which is not possible anymore 2410s del self.args, self.kwds, self.func 2410s try: 2410s > return next(self.gen) 2410s 2410s /usr/lib/python3.13/contextlib.py:141: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @contextlib.contextmanager 2410s def begin(self) -> Iterator[Connection]: 2410s """Return a context manager delivering a :class:`_engine.Connection` 2410s with a :class:`.Transaction` established. 2410s 2410s E.g.:: 2410s 2410s with engine.begin() as conn: 2410s conn.execute( 2410s text("insert into table (x, y, z) values (1, 2, 3)") 2410s ) 2410s conn.execute(text("my_special_procedure(5)")) 2410s 2410s Upon successful operation, the :class:`.Transaction` 2410s is committed. If an error is raised, the :class:`.Transaction` 2410s is rolled back. 2410s 2410s .. seealso:: 2410s 2410s :meth:`_engine.Engine.connect` - procure a 2410s :class:`_engine.Connection` from 2410s an :class:`_engine.Engine`. 2410s 2410s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2410s for a particular :class:`_engine.Connection`. 2410s 2410s """ 2410s > with self.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
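Editor's note: the create_and_load_iris helper quoted a few frames up builds a single multi-row INSERT from the CSV header. The same pattern in isolation looks roughly like the sketch below; the table metadata, column names, and file path are hypothetical stand-ins, and a reachable database is assumed:

    import csv
    from pathlib import Path
    from sqlalchemy import MetaData, Table, Column, Float, String, create_engine, insert

    metadata = MetaData()
    iris = Table(
        "iris", metadata,
        Column("SepalLength", Float),
        Column("SepalWidth", Float),
        Column("PetalLength", Float),
        Column("PetalWidth", Float),
        Column("Name", String(200)),
    )

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    iris_path = Path("iris.csv")  # hypothetical path

    with iris_path.open(newline=None, encoding="utf-8") as csvfile:
        reader = csv.reader(csvfile)
        header = next(reader)
        params = [dict(zip(header, row)) for row in reader]

    with engine.begin() as con:   # raises OperationalError when the server is down
        metadata.create_all(con)
        con.execute(insert(iris).values(params))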
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s ____________________ test_api_to_sql[mysql_pymysql_engine] _____________________ 2410s conn = 'mysql_pymysql_engine' 2410s request = > 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_api_to_sql(conn, request, test_frame1): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1625: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
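Editor's note: test_api_to_sql above uses the common pandas pattern of parametrizing over fixture *names* and resolving them at runtime with request.getfixturevalue, which is why these failures surface inside fixture setup rather than in the test body. A stripped-down sketch of that indirection, with hypothetical fixture names:

    import pytest

    @pytest.fixture
    def sqlite_engine():
        return "sqlite:///:memory:"          # stand-in for a real engine

    @pytest.fixture
    def postgres_engine():
        pytest.skip("no PostgreSQL server available")   # mirrors the failures in this log

    all_connectable = ["sqlite_engine", "postgres_engine"]

    @pytest.mark.parametrize("conn", all_connectable)
    def test_roundtrip(conn, request):
        # The parameter is only a fixture *name*; the value is looked up lazily
        # via getfixturevalue, as in the pandas tests quoted above.
        conn = request.getfixturevalue(conn)
        assert conn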
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49a510>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49a510> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _____________________ test_api_to_sql[mysql_pymysql_conn] ______________________ 2410s conn = 'mysql_pymysql_conn' 2410s request = > 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_api_to_sql(conn, request, test_frame1): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1625: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
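Editor's note: every mysql_pymysql_* parametrization in this run dies at the same point. Importing pymysql pulls in cryptography, and cryptography.hazmat.primitives.hashes fails with the AttributeError reported above because its compiled _rust.openssl binding exposes no "hashes" attribute. One plausible reading is that the pure-Python part of python3-cryptography and its Rust extension are out of step on this testbed; that diagnosis is an inference, not something the log states. A quick probe, reusing the exact import shown in the traceback:

    # Run under the same interpreter as the test run; False (or an ImportError)
    # reproduces the AttributeError raised from hashes.py:87 above.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl
    print(hasattr(rust_openssl, "hashes"))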
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
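Editor's note: the frames above show why the mysql_pymysql_conn failure is a near-verbatim repeat of the mysql_pymysql_engine one: the conn fixture lists the engine fixture among its argnames, so setting it up re-runs the engine fixture, which fails again at the same pymysql import. A small sketch of that dependency shape, with hypothetical fixture names:

    import pytest

    @pytest.fixture
    def engine():
        # Stand-in for mysql_pymysql_engine; its failure is what both tests report.
        raise RuntimeError("driver import failed")

    @pytest.fixture
    def conn(engine):
        # Stand-in for mysql_pymysql_conn: it depends on the engine fixture above.
        return engine

    def test_uses_engine(engine):
        pass

    def test_uses_conn(conn):
        pass

    # Both tests error during setup with the same RuntimeError, mirroring the
    # duplicated tracebacks in this log.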
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
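Editor's note: versioned_importorskip above is a thin, Debian-specific wrapper over pandas' import_optional_dependency, whose parameter documentation continues just below. A sketch of the errors= modes it describes, assuming only that pandas itself is importable in this environment:

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore": returns None when the package is absent instead of raising
    # ImportError. This is the mode pandasSQL_builder uses later in this log to
    # probe for sqlalchemy before deciding which PandasSQL subclass to build.
    sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore")
    print(sqlalchemy is not None)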
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49a630>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49a630> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s _________________ test_api_to_sql[postgresql_psycopg2_engine] __________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 
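Editor's note: from here on the postgresql_psycopg2_* parametrizations fail differently. SQLAlchemy builds its Engine without error, but the first real connection attempt (raw_connection, whose docstring continues below) is refused. A minimal sketch of the call path pandas' SQLDatabase takes, assuming a reachable server; the URL is reconstructed from the log, and the password (masked as *** there) is inferred from the cparams shown further down:

    from sqlalchemy import create_engine, text

    # Assumed credentials: cparams later in this log show password='postgres'.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    with engine.connect() as connection:
        # With nothing listening on localhost:5432 this raises the same
        # sqlalchemy.exc.OperationalError wrapping psycopg2's "Connection refused".
        print(connection.execute(text("select 1")).scalar())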
2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s 
try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. 
The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s request = > 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_api_to_sql(conn, request, test_frame1): 2410s conn = request.getfixturevalue(conn) 2410s > if sql.has_table("test_frame1", conn): 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1626: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s table_name = 'test_frame1' 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None 2410s 2410s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2410s """ 2410s Check if DataBase has named table. 2410s 2410s Parameters 2410s ---------- 2410s table_name: string 2410s Name of SQL table. 2410s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2410s ADBC provides high performance I/O with native type support, where available. 2410s Using SQLAlchemy makes it possible to use any DB supported by that 2410s library. 2410s If a DBAPI2 object, only sqlite3 is supported. 2410s schema : string, default None 2410s Name of SQL schema in database to write to (if database flavor supports 2410s this). If None, use default schema (default). 
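Editor's note: the dsn above spells out what these tests expect: a PostgreSQL server on localhost:5432 with a "pandas" database and postgres/postgres credentials, which this testbed does not provide, hence the Connection refused. A small diagnostic probe using that exact dsn (the has_table docstring continues below); this is a sketch for reproducing the failure locally, not part of the test suite:

    import psycopg2

    DSN = "host=localhost dbname=pandas user=postgres password=postgres port=5432"

    def postgres_available(dsn=DSN):
        # False here reproduces the psycopg2.OperationalError reported above.
        try:
            conn = psycopg2.connect(dsn)
        except psycopg2.OperationalError:
            return False
        conn.close()
        return True

    print(postgres_available())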
2410s 2410s Returns 2410s ------- 2410s boolean 2410s """ 2410s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = False 2410s 2410s def pandasSQL_builder( 2410s con, 2410s schema: str | None = None, 2410s need_transaction: bool = False, 2410s ) -> PandasSQL: 2410s """ 2410s Convenience function to return the correct PandasSQL subclass based on the 2410s provided parameters. Also creates a sqlalchemy connection and transaction 2410s if necessary. 2410s """ 2410s import sqlite3 2410s 2410s if isinstance(con, sqlite3.Connection) or con is None: 2410s return SQLiteDatabase(con) 2410s 2410s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2410s 2410s if isinstance(con, str) and sqlalchemy is None: 2410s raise ImportError("Using URI string without sqlalchemy installed.") 2410s 2410s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2410s > return SQLDatabase(con, schema, need_transaction) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s schema = None, need_transaction = False 2410s 2410s def __init__( 2410s self, con, schema: str | None = None, need_transaction: bool = False 2410s ) -> None: 2410s from sqlalchemy import create_engine 2410s from sqlalchemy.engine import Engine 2410s from sqlalchemy.schema import MetaData 2410s 2410s # self.exit_stack cleans up the Engine and Connection and commits the 2410s # transaction if any of those objects was created below. 2410s # Cleanup happens either in self.__exit__ or at the end of the iterator 2410s # returned by read_sql when chunksize is not None. 2410s self.exit_stack = ExitStack() 2410s if isinstance(con, str): 2410s con = create_engine(con) 2410s self.exit_stack.callback(con.dispose) 2410s if isinstance(con, Engine): 2410s > con = self.exit_stack.enter_context(con.connect()) 2410s 2410s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
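The _handle_dbapi_exception_noconnection frames quoted above show SQLAlchemy wrapping the driver failure into sqlalchemy.exc.OperationalError while chaining the original psycopg2 exception (hence the sqlalche.me/e/20/e3q8 link later in this log). A minimal sketch of how calling code might catch that wrapped error and inspect the underlying driver exception, assuming the same connection URL as in this log; it fails the same way unless a server is listening on localhost:5432:

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            pass  # connection succeeded
    except OperationalError as err:
        # err.orig holds the original psycopg2.OperationalError chained above
        print("database unavailable:", err.orig)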
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
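The psycopg2.connect docstring quoted above lists the basic connection parameters. A minimal sketch using the same keyword arguments as the cparams captured in this traceback; note it raises the same OperationalError seen here unless a PostgreSQL server is actually running on that host and port:

    import psycopg2

    # Keyword arguments mirror the cparams in the traceback above; this raises
    # psycopg2.OperationalError ("Connection refused") unless a PostgreSQL
    # server is listening on localhost:5432.
    conn = psycopg2.connect(
        dbname="pandas",
        user="postgres",
        password="postgres",
        host="localhost",
        port=5432,
    )
    conn.close()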
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________________ test_api_to_sql[postgresql_psycopg2_conn] ___________________ 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s 2410s The above exception was the direct cause of the following exception: 2410s 2410s conn = 'postgresql_psycopg2_conn' 2410s request = > 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_api_to_sql(conn, request, test_frame1): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1625: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'postgresql_psycopg2_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s @pytest.fixture 2410s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2410s > with postgresql_psycopg2_engine.connect() as conn: 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def connect(self) -> Connection: 2410s """Return a new :class:`_engine.Connection` object. 2410s 2410s The :class:`_engine.Connection` acts as a Python context manager, so 2410s the typical use of this method looks like:: 2410s 2410s with engine.connect() as connection: 2410s connection.execute(text("insert into table values ('foo')")) 2410s connection.commit() 2410s 2410s Where above, after the block is completed, the connection is "closed" 2410s and its underlying DBAPI resources are returned to the connection pool. 2410s This also has the effect of rolling back any transaction that 2410s was explicitly begun or was begun via autobegin, and will 2410s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2410s started and is still in progress. 2410s 2410s .. 
seealso:: 2410s 2410s :meth:`_engine.Engine.begin` 2410s 2410s """ 2410s 2410s > return self._connection_cls(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s self._dbapi_connection = engine.raw_connection() 2410s except dialect.loaded_dbapi.Error as err: 2410s > Connection._handle_dbapi_exception_noconnection( 2410s err, dialect, engine 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2410s dialect = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2410s 2410s @classmethod 2410s def _handle_dbapi_exception_noconnection( 2410s cls, 2410s e: BaseException, 2410s dialect: Dialect, 2410s engine: Optional[Engine] = None, 2410s is_disconnect: Optional[bool] = None, 2410s invalidate_pool_on_disconnect: bool = True, 2410s is_pre_ping: bool = False, 2410s ) -> NoReturn: 2410s exc_info = sys.exc_info() 2410s 2410s if is_disconnect is None: 2410s is_disconnect = isinstance( 2410s e, dialect.loaded_dbapi.Error 2410s ) and dialect.is_disconnect(e, None, None) 2410s 2410s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2410s 2410s if should_wrap: 2410s sqlalchemy_exception = exc.DBAPIError.instance( 2410s None, 2410s None, 2410s cast(Exception, e), 2410s dialect.loaded_dbapi.Error, 2410s hide_parameters=( 2410s engine.hide_parameters if engine is not None else False 2410s ), 2410s connection_invalidated=is_disconnect, 2410s dialect=dialect, 2410s ) 2410s else: 2410s sqlalchemy_exception = None 2410s 2410s newraise = None 2410s 2410s if dialect._has_events: 2410s ctx = ExceptionContextImpl( 2410s e, 2410s sqlalchemy_exception, 2410s engine, 2410s dialect, 2410s None, 2410s None, 2410s None, 2410s None, 2410s None, 2410s is_disconnect, 2410s invalidate_pool_on_disconnect, 2410s is_pre_ping, 2410s ) 2410s for fn in dialect.dispatch.handle_error: 2410s try: 2410s # handler returns an exception; 2410s # call next handler in a chain 2410s per_fn = fn(ctx) 2410s if per_fn is not None: 2410s ctx.chained_exception = newraise = per_fn 2410s except Exception as _raised: 2410s # handler raises an exception - stop processing 2410s newraise = _raised 2410s break 2410s 2410s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2410s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2410s ctx.is_disconnect 2410s ) 2410s 2410s if newraise: 2410s raise 
newraise.with_traceback(exc_info[2]) from e 2410s elif should_wrap: 2410s assert sqlalchemy_exception is not None 2410s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s connection = None, _has_events = None, _allow_revalidate = True 2410s _allow_autobegin = True 2410s 2410s def __init__( 2410s self, 2410s engine: Engine, 2410s connection: Optional[PoolProxiedConnection] = None, 2410s _has_events: Optional[bool] = None, 2410s _allow_revalidate: bool = True, 2410s _allow_autobegin: bool = True, 2410s ): 2410s """Construct a new Connection.""" 2410s self.engine = engine 2410s self.dialect = dialect = engine.dialect 2410s 2410s if connection is None: 2410s try: 2410s > self._dbapi_connection = engine.raw_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2410s 2410s def raw_connection(self) -> PoolProxiedConnection: 2410s """Return a "raw" DBAPI connection from the connection pool. 2410s 2410s The returned object is a proxied version of the DBAPI 2410s connection object used by the underlying driver in use. 2410s The object will have all the same behavior as the real DBAPI 2410s connection, except that its ``close()`` method will result in the 2410s connection being returned to the pool, rather than being closed 2410s for real. 2410s 2410s This method provides direct DBAPI connection access for 2410s special situations when the API provided by 2410s :class:`_engine.Connection` 2410s is not needed. When a :class:`_engine.Connection` object is already 2410s present, the DBAPI connection is available using 2410s the :attr:`_engine.Connection.connection` accessor. 2410s 2410s .. seealso:: 2410s 2410s :ref:`dbapi_connections` 2410s 2410s """ 2410s > return self.pool.connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def connect(self) -> PoolProxiedConnection: 2410s """Return a DBAPI connection from the pool. 2410s 2410s The connection is instrumented such that when its 2410s ``close()`` method is called, the connection will be returned to 2410s the pool. 
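The failing tests exercise pandas' SQL API (sql.has_table, DataFrame.to_sql) against backends that need a live server. As the has_table docstring earlier in this log notes, sqlite3 is the one DBAPI2 connection supported directly, so the same calls can be sketched against an in-memory database with no server at all (an illustrative counterpart, not the test's own code):

    import sqlite3

    import pandas as pd
    from pandas.io import sql

    conn = sqlite3.connect(":memory:")
    pd.DataFrame({"A": [1.0, 2.0]}).to_sql("test_frame1", conn, index=False)
    print(sql.has_table("test_frame1", conn))  # True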
2410s 2410s """ 2410s > return _ConnectionFairy._checkout(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s threadconns = None, fairy = None 2410s 2410s @classmethod 2410s def _checkout( 2410s cls, 2410s pool: Pool, 2410s threadconns: Optional[threading.local] = None, 2410s fairy: Optional[_ConnectionFairy] = None, 2410s ) -> _ConnectionFairy: 2410s if not fairy: 2410s > fairy = _ConnectionRecord.checkout(pool) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s cls = 2410s pool = 2410s 2410s @classmethod 2410s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2410s if TYPE_CHECKING: 2410s rec = cast(_ConnectionRecord, pool._do_get()) 2410s else: 2410s > rec = pool._do_get() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _do_get(self) -> ConnectionPoolEntry: 2410s > return self._create_connection() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def _create_connection(self) -> ConnectionPoolEntry: 2410s """Called by subclasses to create a new ConnectionRecord.""" 2410s 2410s > return _ConnectionRecord(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s pool = , connect = True 2410s 2410s def __init__(self, pool: Pool, connect: bool = True): 2410s self.fresh = False 2410s self.fairy_ref = None 2410s self.starttime = 0 2410s self.dbapi_connection = None 2410s 2410s self.__pool = pool 2410s if connect: 2410s > self.__connect() 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s self.dbapi_connection = connection = pool._invoke_creator(self) 2410s pool.logger.debug("Created new connection %r", connection) 2410s self.fresh = True 2410s except BaseException as e: 2410s > with util.safe_reraise(): 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s type_ = None, value = None, traceback = None 2410s 2410s def __exit__( 2410s self, 2410s type_: Optional[Type[BaseException]], 2410s value: Optional[BaseException], 2410s traceback: Optional[types.TracebackType], 2410s ) -> NoReturn: 2410s assert self._exc_info is not None 2410s # see #2703 for notes 2410s if type_ is None: 2410s exc_type, exc_value, exc_tb = self._exc_info 2410s assert exc_value is not None 2410s self._exc_info = None # remove potential circular references 2410s > raise exc_value.with_traceback(exc_tb) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s 2410s def __connect(self) -> None: 2410s pool = self.__pool 2410s 2410s # ensure any existing connection is removed, so that if 2410s # creator fails, this attribute stays None 2410s self.dbapi_connection = None 2410s try: 2410s self.starttime = time.time() 2410s > self.dbapi_connection = connection = pool._invoke_creator(self) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s connection_record = 2410s 2410s def connect( 2410s connection_record: Optional[ConnectionPoolEntry] = None, 2410s ) -> DBAPIConnection: 2410s if dialect._has_events: 2410s for fn in dialect.dispatch.do_connect: 2410s connection = cast( 2410s DBAPIConnection, 2410s fn(dialect, connection_record, cargs, cparams), 2410s ) 2410s if connection is not None: 2410s return connection 2410s 2410s > return dialect.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s cargs = () 2410s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s 2410s def connect(self, *cargs, **cparams): 2410s # inherits the docstring from interfaces.Dialect.connect 2410s > return self.loaded_dbapi.connect(*cargs, **cparams) 2410s 2410s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2410s connection_factory = None, cursor_factory = None 2410s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2410s kwasync = {} 2410s 2410s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2410s """ 2410s Create a new database connection. 2410s 2410s The connection parameters can be specified as a string: 2410s 2410s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2410s 2410s or using a set of keyword arguments: 2410s 2410s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2410s 2410s Or as a mix of both. The basic connection parameters are: 2410s 2410s - *dbname*: the database name 2410s - *database*: the database name (only as keyword argument) 2410s - *user*: user name used to authenticate 2410s - *password*: password used to authenticate 2410s - *host*: database host address (defaults to UNIX socket if not provided) 2410s - *port*: connection port number (defaults to 5432 if not provided) 2410s 2410s Using the *connection_factory* parameter a different class or connections 2410s factory can be specified. It should be a callable object taking a dsn 2410s argument. 2410s 2410s Using the *cursor_factory* parameter, a new default cursor factory will be 2410s used by cursor(). 2410s 2410s Using *async*=True an asynchronous connection will be created. *async_* is 2410s a valid alias (for Python versions where ``async`` is a keyword). 2410s 2410s Any other keyword parameter will be passed to the underlying client 2410s library: the list of supported parameters depends on the library version. 
2410s 2410s """ 2410s kwasync = {} 2410s if 'async' in kwargs: 2410s kwasync['async'] = kwargs.pop('async') 2410s if 'async_' in kwargs: 2410s kwasync['async_'] = kwargs.pop('async_') 2410s 2410s dsn = _ext.make_dsn(dsn, **kwargs) 2410s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2410s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2410s E Is the server running on that host and accepting TCP/IP connections? 2410s E 2410s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2410s 2410s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2410s __________________ test_api_to_sql_fail[mysql_pymysql_engine] __________________ 2410s conn = 'mysql_pymysql_engine' 2410s request = > 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_api_to_sql_fail(conn, request, test_frame1): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 
2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. 
This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
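The long pytest frames here are not specific to the failure: test_api_to_sql_fail is parametrized over fixture names (all_connectable) and resolves each one at runtime with request.getfixturevalue, so any backend that cannot be set up errors inside this fixture machinery. A minimal sketch of that pattern, with a hypothetical sqlite fixture standing in for the pandas ones:

    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_memory():
        # Hypothetical stand-in for fixtures like mysql_pymysql_engine.
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn_name", ["sqlite_memory"])
    def test_roundtrip(conn_name, request):
        # Resolve the fixture dynamically, as test_api_to_sql_fail does.
        conn = request.getfixturevalue(conn_name)
        conn.execute("CREATE TABLE t (x INTEGER)")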
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
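versioned_importorskip (quoted a little above) is a thin, Debian-specific wrapper around pandas' import_optional_dependency: if the optional module imports and is new enough it is returned, otherwise the test is skipped or an ImportError is raised. Only an ImportError from the import is treated as "dependency missing"; any other exception propagates. A small sketch of the underlying call, using the private pandas helper shown in the traceback:

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore" returns None when the module is not installed,
    # letting a caller degrade gracefully instead of raising.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql not installed; skipping MySQL-backed paths")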
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49aff0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49aff0> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2410s 2410s import sys 2410s 2410s from .constants import FIELD_TYPE 2410s from .err import ( 2410s Warning, 2410s Error, 2410s InterfaceError, 2410s DataError, 2410s DatabaseError, 2410s OperationalError, 2410s IntegrityError, 2410s InternalError, 2410s NotSupportedError, 2410s ProgrammingError, 2410s MySQLError, 2410s ) 2410s from .times import ( 2410s Date, 2410s Time, 2410s Timestamp, 2410s DateFromTicks, 2410s TimeFromTicks, 2410s TimestampFromTicks, 2410s ) 2410s 2410s # PyMySQL version. 2410s # Used by setuptools and connection_attrs 2410s VERSION = (1, 1, 1, "final", 1) 2410s VERSION_STRING = "1.1.1" 2410s 2410s ### for mysqlclient compatibility 2410s ### Django checks mysqlclient version. 2410s version_info = (1, 4, 6, "final", 1) 2410s __version__ = "1.4.6" 2410s 2410s 2410s def get_client_info(): # for MySQLdb compatibility 2410s return __version__ 2410s 2410s 2410s def install_as_MySQLdb(): 2410s """ 2410s After this function is called, any application that imports MySQLdb 2410s will unwittingly actually use pymysql. 
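The pymysql module body quoted in this traceback includes its mysqlclient compatibility shim: install_as_MySQLdb() simply aliases the already-imported pymysql module under the name MySQLdb, so code written against mysqlclient keeps working. A minimal usage sketch, for a system where pymysql imports cleanly (it does not on this testbed):

    import pymysql

    pymysql.install_as_MySQLdb()
    import MySQLdb  # resolves to the pymysql module object

    assert MySQLdb is pymysql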
2410s """ 2410s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2410s 2410s 2410s # end of mysqlclient compatibility code 2410s 2410s threadsafety = 1 2410s apilevel = "2.0" 2410s paramstyle = "pyformat" 2410s 2410s > from . import connections # noqa: E402 2410s 2410s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # Python implementation of the MySQL client-server protocol 2410s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2410s # Error codes: 2410s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2410s import errno 2410s import os 2410s import socket 2410s import struct 2410s import sys 2410s import traceback 2410s import warnings 2410s 2410s > from . import _auth 2410s 2410s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s Implements auth methods 2410s """ 2410s 2410s from .err import OperationalError 2410s 2410s 2410s try: 2410s from cryptography.hazmat.backends import default_backend 2410s > from cryptography.hazmat.primitives import serialization, hashes 2410s 2410s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s > from cryptography.hazmat.primitives._serialization import ( 2410s BestAvailableEncryption, 2410s Encoding, 2410s KeySerializationEncryption, 2410s NoEncryption, 2410s ParameterFormat, 2410s PrivateFormat, 2410s PublicFormat, 2410s _KeySerializationEncryption, 2410s ) 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography import utils 2410s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s # This file is dual licensed under the terms of the Apache License, Version 2410s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2410s # for complete details. 
2410s 2410s from __future__ import annotations 2410s 2410s import abc 2410s 2410s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2410s 2410s __all__ = [ 2410s "HashAlgorithm", 2410s "HashContext", 2410s "Hash", 2410s "ExtendableOutputFunction", 2410s "SHA1", 2410s "SHA512_224", 2410s "SHA512_256", 2410s "SHA224", 2410s "SHA256", 2410s "SHA384", 2410s "SHA512", 2410s "SHA3_224", 2410s "SHA3_256", 2410s "SHA3_384", 2410s "SHA3_512", 2410s "SHAKE128", 2410s "SHAKE256", 2410s "MD5", 2410s "BLAKE2b", 2410s "BLAKE2s", 2410s "SM3", 2410s ] 2410s 2410s 2410s class HashAlgorithm(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def name(self) -> str: 2410s """ 2410s A string naming this algorithm (e.g. "sha256", "md5"). 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def digest_size(self) -> int: 2410s """ 2410s The size of the resulting digest in bytes. 2410s """ 2410s 2410s @property 2410s @abc.abstractmethod 2410s def block_size(self) -> int | None: 2410s """ 2410s The internal block size of the hash function, or None if the hash 2410s function does not use blocks internally (e.g. SHA3). 2410s """ 2410s 2410s 2410s class HashContext(metaclass=abc.ABCMeta): 2410s @property 2410s @abc.abstractmethod 2410s def algorithm(self) -> HashAlgorithm: 2410s """ 2410s A HashAlgorithm that will be used by this context. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def update(self, data: bytes) -> None: 2410s """ 2410s Processes the provided bytes through the hash. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def finalize(self) -> bytes: 2410s """ 2410s Finalizes the hash context and returns the hash digest as bytes. 2410s """ 2410s 2410s @abc.abstractmethod 2410s def copy(self) -> HashContext: 2410s """ 2410s Return a HashContext that is a copy of the current context. 2410s """ 2410s 2410s 2410s > Hash = rust_openssl.hashes.Hash 2410s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2410s 2410s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2410s ___________________ test_api_to_sql_fail[mysql_pymysql_conn] ___________________ 2410s conn = 'mysql_pymysql_conn' 2410s request = > 2410s test_frame1 = index A B C D 2410s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2410s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2410s 2410s @pytest.mark.parametrize("conn", all_connectable) 2410s def test_api_to_sql_fail(conn, request, test_frame1): 2410s > conn = request.getfixturevalue(conn) 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1636: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def getfixturevalue(self, argname: str) -> Any: 2410s """Dynamically run a named fixture function. 2410s 2410s Declaring fixtures via function argument is recommended where possible. 2410s But if you can only decide whether to use another fixture at test 2410s setup time, you may use this function to retrieve it inside a fixture 2410s or test function body. 2410s 2410s This method can be used during the test setup phase or the test run 2410s phase, but during the test teardown phase a fixture's value may not 2410s be available. 2410s 2410s :param argname: 2410s The fixture name. 2410s :raises pytest.FixtureLookupError: 2410s If the given fixture could not be found. 
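The AttributeError reported just above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') suggests the cryptography package's compiled Rust extension does not match its Python layer on this i386 testbed, so hashes.py fails while binding Hash. Because that surfaces as an AttributeError rather than an ImportError, neither pymysql's guarded cryptography import nor pandas' import_optional_dependency treats it as a missing dependency, and the MySQL tests error instead of skipping. A hedged sketch of a more defensive probe (illustrative only, not how pymysql or pandas actually handle it):

    def cryptography_usable() -> bool:
        """Return True only if cryptography imports and its hash bindings work."""
        try:
            from cryptography.hazmat.primitives import hashes
            hashes.Hash(hashes.SHA256())  # touches the Rust-backed binding
            return True
        except Exception:  # ImportError, AttributeError, ...
            return False

    print(cryptography_usable())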
2410s """ 2410s # Note that in addition to the use case described in the docstring, 2410s # getfixturevalue() is also called by pytest itself during item and fixture 2410s # setup to evaluate the fixtures that are requested statically 2410s # (using function parameters, autouse, etc). 2410s 2410s > fixturedef = self._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_conn' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 
2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s > fixturedef = request._get_active_fixturedef(argname) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = > 2410s argname = 'mysql_pymysql_engine' 2410s 2410s def _get_active_fixturedef( 2410s self, argname: str 2410s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2410s if argname == "request": 2410s cached_result = (self, [0], None) 2410s return PseudoFixtureDef(cached_result, Scope.Function) 2410s 2410s # If we already finished computing a fixture by this name in this item, 2410s # return it. 2410s fixturedef = self._fixture_defs.get(argname) 2410s if fixturedef is not None: 2410s self._check_scope(fixturedef, fixturedef._scope) 2410s return fixturedef 2410s 2410s # Find the appropriate fixturedef. 2410s fixturedefs = self._arg2fixturedefs.get(argname, None) 2410s if fixturedefs is None: 2410s # We arrive here because of a dynamic call to 2410s # getfixturevalue(argname) which was naturally 2410s # not known at parsing/collection time. 2410s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2410s if fixturedefs is not None: 2410s self._arg2fixturedefs[argname] = fixturedefs 2410s # No fixtures defined with this name. 2410s if fixturedefs is None: 2410s raise FixtureLookupError(argname, self) 2410s # The are no fixtures with this name applicable for the function. 2410s if not fixturedefs: 2410s raise FixtureLookupError(argname, self) 2410s # A fixture may override another fixture with the same name, e.g. a 2410s # fixture in a module can override a fixture in a conftest, a fixture in 2410s # a class can override a fixture in the module, and so on. 2410s # An overriding fixture can request its own name (possibly indirectly); 2410s # in this case it gets the value of the fixture it overrides, one level 2410s # up. 2410s # Check how many `argname`s deep we are, and take the next one. 2410s # `fixturedefs` is sorted from furthest to closest, so use negative 2410s # indexing to go in reverse. 
2410s index = -1 2410s for request in self._iter_chain(): 2410s if request.fixturename == argname: 2410s index -= 1 2410s # If already consumed all of the available levels, fail. 2410s if -index > len(fixturedefs): 2410s raise FixtureLookupError(argname, self) 2410s fixturedef = fixturedefs[index] 2410s 2410s # Prepare a SubRequest object for calling the fixture. 2410s try: 2410s callspec = self._pyfuncitem.callspec 2410s except AttributeError: 2410s callspec = None 2410s if callspec is not None and argname in callspec.params: 2410s param = callspec.params[argname] 2410s param_index = callspec.indices[argname] 2410s # The parametrize invocation scope overrides the fixture's scope. 2410s scope = callspec._arg2scope[argname] 2410s else: 2410s param = NOTSET 2410s param_index = 0 2410s scope = fixturedef._scope 2410s self._check_fixturedef_without_param(fixturedef) 2410s self._check_scope(fixturedef, scope) 2410s subrequest = SubRequest( 2410s self, scope, param, param_index, fixturedef, _ispytest=True 2410s ) 2410s 2410s # Make sure the fixture value is cached, running it if it isn't 2410s > fixturedef.execute(request=subrequest) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s request = > 2410s 2410s def execute(self, request: SubRequest) -> FixtureValue: 2410s """Return the value of this fixture, executing it if not cached.""" 2410s # Ensure that the dependent fixtures requested by this fixture are loaded. 2410s # This needs to be done before checking if we have a cached value, since 2410s # if a dependent fixture has their cache invalidated, e.g. due to 2410s # parametrization, they finalize themselves and fixtures depending on it 2410s # (which will likely include this fixture) setting `self.cached_result = None`. 2410s # See #4871 2410s requested_fixtures_that_should_finalize_us = [] 2410s for argname in self.argnames: 2410s fixturedef = request._get_active_fixturedef(argname) 2410s # Saves requested fixtures in a list so we later can add our finalizer 2410s # to them, ensuring that if a requested fixture gets torn down we get torn 2410s # down first. This is generally handled by SetupState, but still currently 2410s # needed when this fixture is not parametrized but depends on a parametrized 2410s # fixture. 2410s if not isinstance(fixturedef, PseudoFixtureDef): 2410s requested_fixtures_that_should_finalize_us.append(fixturedef) 2410s 2410s # Check for (and return) cached value/exception. 2410s if self.cached_result is not None: 2410s request_cache_key = self.cache_key(request) 2410s cache_key = self.cached_result[1] 2410s try: 2410s # Attempt to make a normal == check: this might fail for objects 2410s # which do not implement the standard comparison (like numpy arrays -- #6497). 2410s cache_hit = bool(request_cache_key == cache_key) 2410s except (ValueError, RuntimeError): 2410s # If the comparison raises, use 'is' as fallback. 2410s cache_hit = request_cache_key is cache_key 2410s 2410s if cache_hit: 2410s if self.cached_result[2] is not None: 2410s exc, exc_tb = self.cached_result[2] 2410s raise exc.with_traceback(exc_tb) 2410s else: 2410s result = self.cached_result[0] 2410s return result 2410s # We have a previous but differently parametrized fixture instance 2410s # so we need to tear it down before creating a new one. 
2410s self.finish(request) 2410s assert self.cached_result is None 2410s 2410s # Add finalizer to requested fixtures we saved previously. 2410s # We make sure to do this after checking for cached value to avoid 2410s # adding our finalizer multiple times. (#12135) 2410s finalizer = functools.partial(self.finish, request=request) 2410s for parent_fixture in requested_fixtures_that_should_finalize_us: 2410s parent_fixture.addfinalizer(finalizer) 2410s 2410s ihook = request.node.ihook 2410s try: 2410s # Setup the fixture, run the code in it, and cache the value 2410s # in self.cached_result 2410s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def __call__(self, **kwargs: object) -> Any: 2410s """Call the hook. 2410s 2410s Only accepts keyword arguments, which should match the hook 2410s specification. 2410s 2410s Returns the result(s) of calling all registered plugins, see 2410s :ref:`calling`. 2410s """ 2410s assert ( 2410s not self.is_historic() 2410s ), "Cannot directly call a historic hook - use call_historic instead." 2410s self._verify_all_args_are_provided(kwargs) 2410s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2410s # Copy because plugins may register other plugins during iteration (#438). 2410s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2410s hook_name = 'pytest_fixture_setup' 2410s methods = [>] 2410s kwargs = {'fixturedef': , 'request': >} 2410s firstresult = True 2410s 2410s def _hookexec( 2410s self, 2410s hook_name: str, 2410s methods: Sequence[HookImpl], 2410s kwargs: Mapping[str, object], 2410s firstresult: bool, 2410s ) -> object | list[object]: 2410s # called from all hookcaller instances. 
2410s # enable_tracing will set its own wrapping function at self._inner_hookexec 2410s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2410s 2410s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s @pytest.hookimpl(wrapper=True) 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[object], request: SubRequest 2410s ) -> Generator[None, object, object]: 2410s try: 2410s > return (yield) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturedef = 2410s request = > 2410s 2410s def pytest_fixture_setup( 2410s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2410s ) -> FixtureValue: 2410s """Execution of fixture setup.""" 2410s kwargs = {} 2410s for argname in fixturedef.argnames: 2410s kwargs[argname] = request.getfixturevalue(argname) 2410s 2410s fixturefunc = resolve_fixture_function(fixturedef, request) 2410s my_cache_key = fixturedef.cache_key(request) 2410s try: 2410s > result = call_fixture_func(fixturefunc, request, kwargs) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s fixturefunc = 2410s request = > 2410s kwargs = {} 2410s 2410s def call_fixture_func( 2410s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2410s ) -> FixtureValue: 2410s if is_generator(fixturefunc): 2410s fixturefunc = cast( 2410s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2410s ) 2410s generator = fixturefunc(**kwargs) 2410s try: 2410s > fixture_result = next(generator) 2410s 2410s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s @pytest.fixture 2410s def mysql_pymysql_engine(): 2410s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2410s > pymysql = td.versioned_importorskip("pymysql") 2410s 2410s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s args = ('pymysql',), kwargs = {} 2410s 2410s def versioned_importorskip(*args, **kwargs): 2410s """ 2410s (warning - this is currently Debian-specific, the name may change if upstream request this) 2410s 2410s Return the requested module, or skip the test if it is 2410s not available in a new enough version. 2410s 2410s Intended as a replacement for pytest.importorskip that 2410s defaults to requiring at least pandas' minimum version for that 2410s optional dependency, rather than any version. 2410s 2410s See import_optional_dependency for full parameter documentation. 2410s """ 2410s try: 2410s > module = import_optional_dependency(*args, **kwargs) 2410s 2410s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2410s 2410s def import_optional_dependency( 2410s name: str, 2410s extra: str = "", 2410s errors: str = "raise", 2410s min_version: str | None = None, 2410s ): 2410s """ 2410s Import an optional dependency. 
2410s 2410s By default, if a dependency is missing an ImportError with a nice 2410s message will be raised. If a dependency is present, but too old, 2410s we raise. 2410s 2410s Parameters 2410s ---------- 2410s name : str 2410s The module name. 2410s extra : str 2410s Additional text to include in the ImportError message. 2410s errors : str {'raise', 'warn', 'ignore'} 2410s What to do when a dependency is not found or its version is too old. 2410s 2410s * raise : Raise an ImportError 2410s * warn : Only applicable when a module's version is to old. 2410s Warns that the version is too old and returns None 2410s * ignore: If the module is not installed, return None, otherwise, 2410s return the module, even if the version is too old. 2410s It's expected that users validate the version locally when 2410s using ``errors="ignore"`` (see. ``io/html.py``) 2410s min_version : str, default None 2410s Specify a minimum version that is different from the global pandas 2410s minimum version required. 2410s Returns 2410s ------- 2410s maybe_module : Optional[ModuleType] 2410s The imported module, when found and the version is correct. 2410s None is returned when the package is not found and `errors` 2410s is False, or when the package's version is too old and `errors` 2410s is ``'warn'`` or ``'ignore'``. 2410s """ 2410s assert errors in {"warn", "raise", "ignore"} 2410s if name=='numba' and warn_numba_platform: 2410s warnings.warn(warn_numba_platform) 2410s 2410s package_name = INSTALL_MAPPING.get(name) 2410s install_name = package_name if package_name is not None else name 2410s 2410s msg = ( 2410s f"Missing optional dependency '{install_name}'. {extra} " 2410s f"Use pip or conda to install {install_name}." 2410s ) 2410s try: 2410s > module = importlib.import_module(name) 2410s 2410s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None 2410s 2410s def import_module(name, package=None): 2410s """Import a module. 2410s 2410s The 'package' argument is required when performing a relative import. It 2410s specifies the package to use as the anchor point from which to resolve the 2410s relative import to an absolute import. 2410s 2410s """ 2410s level = 0 2410s if name.startswith('.'): 2410s if not package: 2410s raise TypeError("the 'package' argument is required to perform a " 2410s f"relative import for {name!r}") 2410s for character in name: 2410s if character != '.': 2410s break 2410s level += 1 2410s > return _bootstrap._gcd_import(name[level:], package, level) 2410s 2410s /usr/lib/python3.13/importlib/__init__.py:88: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', package = None, level = 0 2410s 2410s > ??? 2410s 2410s :1387: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 2410s 2410s :1360: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s name = 'pymysql', import_ = 2410s 2410s > ??? 
2410s 2410s :1331: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49b050>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2410s 2410s > ??? 2410s 2410s :935: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49b050> 2410s module = 2410s 2410s > ??? 2410s 2410s :1022: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s f = 2410s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2410s kwds = {} 2410s 2410s > ??? 2410s 2410s :488: 2410s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2410s 2410s """ 2410s PyMySQL: A pure-Python MySQL client library. 2410s 2410s Copyright (c) 2010-2016 PyMySQL contributors 2410s 2410s Permission is hereby granted, free of charge, to any person obtaining a copy 2410s of this software and associated documentation files (the "Software"), to deal 2410s in the Software without restriction, including without limitation the rights 2410s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2410s copies of the Software, and to permit persons to whom the Software is 2410s furnished to do so, subject to the following conditions: 2410s 2410s The above copyright notice and this permission notice shall be included in 2410s all copies or substantial portions of the Software. 2410s 2410s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2410s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2410s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2410s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2410s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2410s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2410s THE SOFTWARE. 2410s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______________ test_api_to_sql_fail[postgresql_psycopg2_engine] _______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
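The postgresql_psycopg2_engine fixture did construct an Engine; SQLAlchemy engines are lazy, so the refused connection only surfaces here, when Engine.connect() asks the pool for its first real DBAPI connection (raw_connection, pool.connect, _ConnectionRecord, dialect.connect, psycopg2.connect). A minimal sketch of that behaviour, assuming the same URL the fixture uses:

    from sqlalchemy import create_engine, text

    # create_engine() is lazy: no socket is opened here.
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        # The first real connection attempt happens here, via the pool,
        # which is where the OperationalError above is raised.
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except Exception as exc:
        print("connect failed:", exc)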
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
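As the psycopg2 docstring above explains, connect() accepts a libpq-style DSN string, keyword arguments, or a mix; keywords are folded into the DSN by make_dsn (the _ext.make_dsn call in the traceback) before the real connection attempt. A short equivalent sketch, using the same hypothetical local credentials as the fixtures:

    import psycopg2
    from psycopg2.extensions import make_dsn

    # Build the same kind of DSN string shown in the traceback above.
    dsn = make_dsn(host="localhost", dbname="pandas", user="postgres",
                   password="postgres", port=5432)

    try:
        conn = psycopg2.connect(dsn)  # equivalent to passing the keywords directly
        conn.close()
    except psycopg2.OperationalError as exc:
        print("no server listening:", exc)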
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_fail(conn, request, test_frame1): 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_frame2", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1637: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_frame2' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
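pandas' has_table (reached via sql.has_table in the test body) accepts a table name plus any supported connectable; with an SQLAlchemy engine it must open a real connection first, which is why the connection refusal is re-raised from inside it here. A minimal sketch using a throwaway sqlite3 connection, which needs no server:

    import sqlite3
    from pandas.io import sql

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE test_frame2 (x INTEGER)")

    # True: the table exists in this in-memory database.
    print(sql.has_table("test_frame2", conn))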
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
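The Engine.connect, raw_connection and Pool.connect docstrings quoted above describe the connection acquisition the pandas fixtures rely on. As a standalone sketch using the engine URL from this log (on this testbed it would fail with the same connection-refused error):

    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    # Context-manager form from the docstring: on exit the DBAPI connection is
    # returned to the pool and any transaction still in progress is rolled back.
    with engine.connect() as connection:
        connection.execute(text("select 1"))
        connection.commit()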
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ________________ test_api_to_sql_fail[postgresql_psycopg2_conn] ________________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
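Stepping back from the pool internals: the sql.has_table call that opened the first of these tracebacks accepts any connectable its docstring lists. With a plain sqlite3 connection it goes through the SQLiteDatabase branch of pandasSQL_builder and needs no external server; a small sketch under that assumption:

    import sqlite3
    from pandas.io import sql

    # DBAPI2 branch of pandasSQL_builder: only sqlite3 connections are supported.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE test_frame2 (a INTEGER)")
    assert sql.has_table("test_frame2", con)
    assert not sql.has_table("some_other_table", con)
    con.close()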
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
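The dsn = _ext.make_dsn(dsn, **kwargs) line in psycopg2's connect(), shown in the frames above, is psycopg2.extensions.make_dsn: it folds keyword arguments into a single libpq connection string. In isolation the step looks roughly like this (parameter ordering in the result may differ):

    from psycopg2.extensions import make_dsn

    # Produces a DSN equivalent to the one shown in the traceback:
    # 'host=localhost dbname=pandas user=postgres password=postgres port=5432'
    dsn = make_dsn(None, host="localhost", port=5432, user="postgres",
                   password="postgres", dbname="pandas")
    print(dsn)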
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_fail(conn, request, test_frame1): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
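The getfixturevalue docstring above describes the pattern the pandas SQL tests use: the parametrized value is a fixture name, resolved dynamically at test time. A self-contained toy version of that pattern (the fixture name and value here are made up for illustration):

    import pytest

    @pytest.fixture
    def sqlite_str():
        return "sqlite:///:memory:"

    all_connectable = ["sqlite_str"]

    @pytest.mark.parametrize("conn", all_connectable)
    def test_resolves_fixture_by_name(conn, request):
        # 'conn' arrives as a fixture *name*; getfixturevalue runs the fixture
        # and returns its value, just as test_api_to_sql_fail does above.
        conn = request.getfixturevalue(conn)
        assert conn == "sqlite:///:memory:"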
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
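The pluggy frames above dispatch pytest_fixture_setup as a firstresult hook (note firstresult = True in the call): the first implementation returning a non-None value wins. A minimal pluggy sketch of that mechanism, using made-up hook names rather than pytest's own:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def compute(self, value): ...

    class Plugin:
        @hookimpl
        def compute(self, value):
            return value + 1

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    # firstresult=True: a single value is returned instead of a list of results.
    assert pm.hook.compute(value=1) == 2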
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ________________ test_api_to_sql_replace[mysql_pymysql_engine] _________________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_replace(conn, request, test_frame1): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1651: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
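The pluggy frames quoted here dispatch pytest_fixture_setup through a hook caller, which only accepts keyword arguments and, for first-result hooks, returns the first non-None value. A small self-contained sketch of that mechanism; the project name and hook name below are invented for illustration:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def build_fixture(self, name):
            """Return a value for name; the first non-None result wins."""

    class Plugin:
        @hookimpl
        def build_fixture(self, name):
            return f"value for {name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    # hook calls take keyword arguments only, mirroring _hookexec above
    print(pm.hook.build_fixture(name="mysql_pymysql_engine"))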
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49bb30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49bb30> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _________________ test_api_to_sql_replace[mysql_pymysql_conn] __________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_replace(conn, request, test_frame1): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1651: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
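The AttributeError above comes from the final line of cryptography's hashes.py, where the pure-Python layer expects a hashes submodule in the compiled _rust bindings; the failure suggests the installed Python files and the compiled extension do not match. On a consistent install, the module quoted above is used like this:

    from cryptography.hazmat.primitives import hashes

    # Hash is the concrete HashContext implementation backed by the Rust/OpenSSL
    # bindings; importing and using it only works when the bindings match the
    # Python layer, which is what the AttributeError above indicates is not the case
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas autopkgtest")
    print(digest.finalize().hex())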
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49bc50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea49bc50> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _____________ test_api_to_sql_replace[postgresql_psycopg2_engine] ______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
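The psycopg2.connect docstring being quoted here accepts either a DSN string or keyword arguments. A minimal sketch using the same parameters the fixture passes; without a PostgreSQL server on localhost:5432 it raises the same "Connection refused" error that SQLAlchemy wraps above:

    import psycopg2

    dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    try:
        # DSN-string form; the keyword form psycopg2.connect(host="localhost",
        # dbname="pandas", user="postgres", password="postgres", port=5432)
        # is equivalent and matches the cparams dict shown in the traceback
        conn = psycopg2.connect(dsn)
    except psycopg2.OperationalError as exc:
        # with no server listening, this is the underlying error for the failure above
        print(exc)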
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_replace(conn, request, test_frame1): 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_frame3", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1652: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_frame3' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
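has_table and to_sql do not need a database server when given a sqlite3 DBAPI connection, so the behaviour test_api_to_sql_replace exercises can be sketched without the missing PostgreSQL/MySQL services. The frame and table name below are illustrative only, reusing the table name from the test:

    import sqlite3
    import pandas as pd
    from pandas.io import sql

    conn = sqlite3.connect(":memory:")
    frame = pd.DataFrame({"A": [1.0, 2.0], "B": [3.0, 4.0]})
    # the first write creates the table, the second replaces it in place
    frame.to_sql("test_frame3", conn, index=False)
    frame.to_sql("test_frame3", conn, if_exists="replace", index=False)
    assert sql.has_table("test_frame3", conn)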
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
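For reference, the Engine.connect docstring quoted above is the entry point for each of these failures. A minimal sketch of that call pattern, reusing the URL and the password shown in the logged cparams (a test-only credential), would be:

    from sqlalchemy import create_engine, exc, text

    # create_engine is lazy; the pooled DBAPI connection is only created on connect(),
    # which is where the OperationalError surfaces when nothing listens on port 5432.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as connection:
            connection.execute(text("select 1"))
    except exc.OperationalError as err:
        print("database unreachable:", err)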
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______________ test_api_to_sql_replace[postgresql_psycopg2_conn] _______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
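Each failure appears twice because SQLAlchemy wraps the driver error and re-raises it with "from e", which is what produces the "The above exception was the direct cause of the following exception" banner. A minimal, self-contained illustration of that chaining:

    class WrappedError(Exception):
        """Stand-in for sqlalchemy.exc.OperationalError in this sketch."""

    try:
        try:
            raise ConnectionRefusedError("port 5432")              # the driver-level failure
        except ConnectionRefusedError as e:
            raise WrappedError("wrapped for the caller") from e    # sets __cause__
    except WrappedError as err:
        assert isinstance(err.__cause__, ConnectionRefusedError)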
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_replace(conn, request, test_frame1): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1651: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
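The getfixturevalue machinery quoted here is how these parametrized tests turn a fixture name into a live connection at run time. A self-contained sketch of the same pattern; the sqlite_conn fixture is hypothetical and stands in for the engine fixtures defined in test_sql.py:

    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        con = sqlite3.connect(":memory:")
        yield con
        con.close()

    # parametrize passes fixture *names*; request.getfixturevalue resolves them lazily,
    # so an unreachable backend only fails the tests that actually request it.
    @pytest.mark.parametrize("conn", ["sqlite_conn"])
    def test_fixture_by_name(conn, request):
        conn = request.getfixturevalue(conn)
        assert conn.execute("select 1").fetchone() == (1,)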
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
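Stripped of the SQLAlchemy and pytest layers, every PostgreSQL failure in this run reduces to the psycopg2 call at the bottom of the traceback. A minimal sketch using the DSN recorded in the log; it raises OperationalError unless a PostgreSQL server is actually listening on localhost:5432:

    import psycopg2

    dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    try:
        conn = psycopg2.connect(dsn)
    except psycopg2.OperationalError as err:
        # This is the "Connection refused ... Is the server running" error seen above.
        print(err)
    else:
        conn.close()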
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _________________ test_api_to_sql_append[mysql_pymysql_engine] _________________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_append(conn, request, test_frame1): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1669: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
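The pluggy frames above show pytest_fixture_setup being dispatched as a firstresult hook through a PluginManager. A small standalone sketch of that hook-call mechanism (the names here are illustrative, not pytest's own hooks):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class DemoSpec:
        @hookspec(firstresult=True)
        def setup_resource(self, name):
            """Return a resource for *name*; the first non-None result wins."""

    class DemoPlugin:
        @hookimpl
        def setup_resource(self, name):
            return f"resource:{name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(DemoSpec)
    pm.register(DemoPlugin())
    print(pm.hook.setup_resource(name="db"))  # -> 'resource:db'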
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
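The skip machinery above (versioned_importorskip -> import_optional_dependency) ultimately funnels into importlib.import_module. A simplified sketch of that optional-dependency pattern; this is not pandas' exact code, and the version comparison here uses packaging as a stand-in for pandas' own version helpers:

    import importlib

    from packaging.version import Version  # stand-in; pandas uses its own helpers

    def import_optional(name, min_version=None):
        # import the module, turning a missing package into a clear error
        try:
            module = importlib.import_module(name)
        except ImportError as err:
            raise ImportError(f"Missing optional dependency '{name}'.") from err
        installed = getattr(module, "__version__", None)
        if min_version is not None and installed is not None:
            if Version(installed) < Version(min_version):
                raise ImportError(f"{name}>={min_version} is required, found {installed}")
        return module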
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2406b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2406b0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
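The install_as_MySQLdb helper shown in the pymysql source above simply aliases the already-imported pymysql module under the MySQLdb name in sys.modules. On a machine where pymysql imports cleanly, usage looks like:

    import pymysql

    pymysql.install_as_MySQLdb()  # sys.modules["MySQLdb"] now points at pymysql
    import MySQLdb

    print(MySQLdb.__name__)  # 'pymysql'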
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s __________________ test_api_to_sql_append[mysql_pymysql_conn] __________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_append(conn, request, test_frame1): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1669: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
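The AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") is the root cause of every mysql_pymysql_* fixture failure in this run: importing pymysql pulls in pymysql._auth, which imports cryptography's hashes module, and that module fails on this testbed while binding its Rust-backed classes. The import in pymysql/_auth.py sits inside a try block, but whatever that guard catches evidently does not include AttributeError, so the error propagates and the fixture errors instead of skipping. A sketch (not pymysql's actual code) of a guard broad enough to tolerate a dependency that breaks part-way through its own import:

    # Sketch only: tolerate an optional dependency that fails during import
    # with something other than ImportError (e.g. AttributeError from bindings).
    try:
        from cryptography.hazmat.primitives import hashes
        _have_cryptography = True
    except Exception:  # deliberately broader than ImportError in this sketch
        hashes = None
        _have_cryptography = False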
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2407d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2407d0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ______________ test_api_to_sql_append[postgresql_psycopg2_engine] ______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
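The frames above trace Engine.connect -> raw_connection -> pool checkout -> _ConnectionRecord.__connect, i.e. the first real DBAPI connection is only attempted when connect() is called, not when the engine is created. A minimal reproduction against the same URL shown in the engine repr (the password is masked in the log):

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    # URL copied from the engine repr above; '***' stands in for the masked password
    engine = create_engine("postgresql+psycopg2://postgres:***@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        print("connect failed:", exc)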
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_append(conn, request, test_frame1): 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_frame4", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1670: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_frame4' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
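The underlying psycopg2 failure above is a plain connection-refused against localhost:5432, using the parameters visible in cparams (no PostgreSQL server is running on the testbed). Reproduced directly, outside SQLAlchemy and pandas:

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432, dbname="pandas",
            user="postgres", password="postgres",  # values as shown in cparams above
        )
    except psycopg2.OperationalError as exc:
        print("no PostgreSQL server listening:", exc)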
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
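[Editor's note] A minimal sketch of the two connection styles documented in the Engine.connect() and Engine.raw_connection() docstrings quoted above (SQLAlchemy 2.0 usage); the URL is the same illustrative test DSN and a running server is assumed.

    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

    # Connection used as a context manager, per the Engine.connect() docstring
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
        conn.commit()

    # "Raw" pool-proxied DBAPI connection, per the Engine.raw_connection() docstring;
    # close() returns it to the pool rather than closing the underlying socket
    raw = engine.raw_connection()
    raw.close()

    engine.dispose()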
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _______________ test_api_to_sql_append[postgresql_psycopg2_conn] _______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_append(conn, request, test_frame1): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1669: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
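[Editor's note] A minimal conftest.py sketch (assumption) of a pytest_fixture_setup wrapper hook, modelled on the _pytest.setuponly wrapper that appears in this traceback; the hook is called through the pluggy machinery shown above.

    # conftest.py
    import pytest

    @pytest.hookimpl(wrapper=True)
    def pytest_fixture_setup(fixturedef, request):
        # code placed before the yield would run before the fixture function;
        # the yield returns the fixture's result (or propagates its exception)
        return (yield)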
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
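[Editor's note] A sketch (assumption, same illustrative URL) of alternate pool settings for the engine. Neither avoids the "Connection refused" failures in this log -- that needs a listening server -- but they change how the QueuePool checkout path shown in this traceback is exercised.

    from sqlalchemy import create_engine
    from sqlalchemy.pool import NullPool

    no_pool_engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        poolclass=NullPool,      # open and close a DBAPI connection on every checkout
    )
    pre_ping_engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
        pool_pre_ping=True,      # test pooled connections with a lightweight ping on checkout
    )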
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______________ test_api_to_sql_type_mapping[mysql_pymysql_engine] ______________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s test_frame3 = index A B 2411s 0 2000-01-03 00:00:00 2147483647 -1.987670 2411s 1 2000-01-04 00:00:00 -29 -0.041232 2411s 2 2000-01-05 00:00:00 20000 0.731168 2411s 3 2000-01-06 00:00:00 -290867 1.567621 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_type_mapping(conn, request, test_frame3): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1688: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb240e90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb240e90> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______________ test_api_to_sql_type_mapping[mysql_pymysql_conn] _______________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s test_frame3 = index A B 2411s 0 2000-01-03 00:00:00 2147483647 -1.987670 2411s 1 2000-01-04 00:00:00 -29 -0.041232 2411s 2 2000-01-05 00:00:00 20000 0.731168 2411s 3 2000-01-06 00:00:00 -290867 1.567621 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_type_mapping(conn, request, test_frame3): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1688: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb240fb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb240fb0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________ test_api_to_sql_type_mapping[postgresql_psycopg2_engine] ___________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s test_frame3 = index A B 2411s 0 2000-01-03 00:00:00 2147483647 -1.987670 2411s 1 2000-01-04 00:00:00 -29 -0.041232 2411s 2 2000-01-05 00:00:00 20000 0.731168 2411s 3 2000-01-06 00:00:00 -290867 1.567621 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_type_mapping(conn, request, test_frame3): 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_frame5", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1689: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_frame5' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
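[editor's note] The Engine.connect() docstring quoted above shows the context-manager pattern. A self-contained sketch of it, assuming SQLAlchemy 2.x as used in this run, against an in-memory SQLite engine so it runs without the missing PostgreSQL server.

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite+pysqlite:///:memory:")

with engine.connect() as connection:
    connection.execute(text("CREATE TABLE t (x INTEGER)"))
    connection.execute(text("INSERT INTO t (x) VALUES (1)"))
    connection.commit()  # explicit commit, as in the docstring example
    print(connection.execute(text("SELECT x FROM t")).scalar())  # 1

engine.dispose()
```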
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
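[editor's note] The psycopg2 docstring above notes that extra keyword arguments are merged into the DSN and passed to the client library; the traceback shows this happening via make_dsn. A sketch exercising only the DSN-building step, so it runs with no server at all (connect_timeout is an example libpq parameter, not something the test suite sets).

```python
from psycopg2.extensions import make_dsn

# Merge a base DSN string with keyword overrides; no connection is opened.
dsn = make_dsn("host=localhost dbname=pandas", user="postgres",
               port=5432, connect_timeout=5)
print(dsn)  # e.g. "host=localhost dbname=pandas user=postgres port=5432 connect_timeout=5"
```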
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ____________ test_api_to_sql_type_mapping[postgresql_psycopg2_conn] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
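[editor's note] As the log shows, SQLAlchemy re-raises the driver error as sqlalchemy.exc.OperationalError (the sqlalche.me/e/20/e3q8 link). A sketch of how calling code typically catches the wrapper; it assumes psycopg2 is installed but needs no running server, since the failure happens at connect time.

```python
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect():
        pass
except OperationalError as exc:
    # exc.orig is the underlying psycopg2.OperationalError seen above.
    print(type(exc.orig).__name__, "-", exc.orig)
```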
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s test_frame3 = index A B 2411s 0 2000-01-03 00:00:00 2147483647 -1.987670 2411s 1 2000-01-04 00:00:00 -29 -0.041232 2411s 2 2000-01-05 00:00:00 20000 0.731168 2411s 3 2000-01-06 00:00:00 -290867 1.567621 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_type_mapping(conn, request, test_frame3): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1688: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
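[editor's note] The repeated "Connection refused" messages cover both resolutions of localhost, IPv6 (::1) and IPv4 (127.0.0.1). A dependency-free sketch reproducing that reachability check outside the test suite, using the host and port from the log.

```python
import socket

for host, family in (("::1", socket.AF_INET6), ("127.0.0.1", socket.AF_INET)):
    s = socket.socket(family, socket.SOCK_STREAM)
    s.settimeout(1.0)
    try:
        s.connect((host, 5432))
        print(host, "port 5432 is open")
    except OSError as exc:
        # With no PostgreSQL running this prints "Connection refused" for both.
        print(host, "port 5432:", exc)
    finally:
        s.close()
```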
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
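[editor's note] The pool checkout chain above (raw_connection → Pool.connect → _ConnectionRecord.__connect) is where the failure surfaces. As an illustrative tweak, not something the pandas fixtures actually do, an engine can be configured to validate pooled connections and time out faster when the server is down.

```python
from sqlalchemy import create_engine

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas",
    pool_pre_ping=True,                    # ping pooled connections before reuse
    connect_args={"connect_timeout": 5},   # libpq-level timeout, in seconds
)
```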
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _________________ test_api_to_sql_series[mysql_pymysql_engine] _________________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_series(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1701: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
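[editor's note] The pandas tests parametrize over fixture *names* (all_connectable) and resolve them at runtime with request.getfixturevalue(), which is why each backend shows up as its own test id. A minimal sketch of that indirection with an illustrative fixture name.

```python
import pytest

@pytest.fixture
def sqlite_str():
    # Stand-in for pandas' per-backend connection fixtures.
    return "sqlite:///:memory:"

@pytest.mark.parametrize("conn", ["sqlite_str"])
def test_resolves_fixture_by_name(conn, request):
    conn = request.getfixturevalue(conn)  # turn the fixture name into its value
    assert conn.startswith("sqlite")
```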
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
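[editor's note] FixtureDef.execute() above shows that a fixture's setup exception is cached and re-raised for later requests in the same scope. A small sketch of that behaviour (module-scoped, fixture name illustrative): both tests error during setup, but the fixture body only runs once.

```python
import pytest

@pytest.fixture(scope="module")
def db_engine():
    # Fails once; the cached exception is replayed for every test below.
    raise RuntimeError("database unavailable")

def test_first(db_engine):
    pass

def test_second(db_engine):
    pass
```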
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2417f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2417f0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s __________________ test_api_to_sql_series[mysql_pymysql_conn] __________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_series(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1701: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb241910>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb241910> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ______________ test_api_to_sql_series[postgresql_psycopg2_engine] ______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_series(conn, request): 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_series", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1702: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_series' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
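[editor's note, not part of the original log output] The pandas.io.sql.has_table helper whose docstring is quoted a few frames above does not itself require PostgreSQL; against an in-memory sqlite3 connection it works without any server. A minimal standalone sketch (table name mirrors the test's "test_series"; everything else is hypothetical):

# Minimal sketch: pandas.io.sql.has_table against a plain sqlite3 connection,
# which needs no database server, unlike the failing PostgreSQL fixtures above.
import sqlite3
import pandas as pd
from pandas.io import sql

con = sqlite3.connect(":memory:")
pd.DataFrame({"a": [1, 2]}).to_sql("test_series", con, index=False)
print(sql.has_table("test_series", con))    # True
print(sql.has_table("missing_table", con))  # False
con.close()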
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _______________ test_api_to_sql_series[postgresql_psycopg2_conn] _______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
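[editor's note, not part of the original log output] The sqlalchemy.exc.OperationalError reported just above is SQLAlchemy's wrapper around the psycopg2 error, raised when Engine.connect() first checks a connection out of the pool. A hedged sketch of that path, using the engine URL shown throughout the traceback (the password is masked as *** in the log but appears as 'postgres' in the DSN frame):

# Illustrative sketch: the point at which SQLAlchemy wraps the driver error.
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect():
        pass
except OperationalError as exc:
    # With no server on localhost:5432 this raises sqlalchemy.exc.OperationalError
    # wrapping psycopg2.OperationalError, as in the log above.
    print(exc)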
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_series(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1701: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
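[editor's note, not part of the original log output] The frames above show how the pandas SQL tests are parametrized over fixture *names* and resolve the actual connection at runtime with request.getfixturevalue. A minimal, self-contained sketch of that pattern with hypothetical fixture names (not taken from the log):

# Minimal sketch (hypothetical names): the parametrize-plus-getfixturevalue
# pattern used by pandas/tests/io/test_sql.py.
import sqlite3
import pytest

@pytest.fixture
def sqlite_conn():
    con = sqlite3.connect(":memory:")
    yield con
    con.close()

@pytest.mark.parametrize("conn", ["sqlite_conn"])
def test_roundtrip(conn, request):
    conn = request.getfixturevalue(conn)  # resolve the fixture by name
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("INSERT INTO t VALUES (1)")
    assert conn.execute("SELECT COUNT(*) FROM t").fetchone()[0] == 1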
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
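[editor's note, not part of the original log output] Engine.raw_connection, whose docstring is quoted again just above, hands back a proxied DBAPI connection whose close() returns it to the pool rather than closing it. A small sketch of that documented behaviour; the sqlite URL is an assumption chosen so the example runs without the PostgreSQL server this testbed lacks:

# Illustrative sketch: Engine.raw_connection() as described above.
from sqlalchemy import create_engine

engine = create_engine("sqlite:///:memory:")  # any reachable backend
raw = engine.raw_connection()
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())  # (1,)
finally:
    raw.close()  # returned to the pool, not closed for real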
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ___________________ test_api_roundtrip[mysql_pymysql_engine] ___________________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip(conn, request, test_frame1): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1715: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb242750>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb242750> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ____________________ test_api_roundtrip[mysql_pymysql_conn] ____________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip(conn, request, test_frame1): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1715: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 
2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb242990>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb242990> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ________________ test_api_roundtrip[postgresql_psycopg2_engine] ________________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip(conn, request, test_frame1): 2411s conn_name = conn 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_frame_roundtrip", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1716: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_frame_roundtrip' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
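_handle_dbapi_exception_noconnection, shown above, is why the refused connection surfaces twice in this log: once as the driver-level psycopg2.OperationalError and once wrapped as sqlalchemy.exc.OperationalError. A sketch of how that wrapping looks from user code; the URL and port are illustrative (chosen so the connect fails quickly), and psycopg2 is assumed installed:

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5433/pandas")
    try:
        engine.connect()
    except OperationalError as exc:
        # The original DBAPI exception is kept on .orig.
        print(type(exc.orig))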
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
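The dsn string shown in this frame is produced from the keyword arguments in cparams by psycopg2's make_dsn helper (the _ext alias in the quoted source). A small sketch of that step in isolation, using the same values as in this log:

    from psycopg2.extensions import make_dsn

    # Keyword arguments are merged into the libpq connection string that
    # _connect() ultimately receives.
    dsn = make_dsn(host="localhost", dbname="pandas", user="postgres",
                   password="postgres", port=5432)
    print(dsn)  # e.g. host=localhost dbname=pandas user=postgres password=postgres port=5432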
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _________________ test_api_roundtrip[postgresql_psycopg2_conn] _________________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
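The raw_connection docstring quoted above describes a proxied DBAPI connection whose close() hands it back to the pool rather than closing it. A runnable sketch of that access path against an in-memory SQLite engine, since the PostgreSQL engine in this run cannot connect:

    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")
    raw = engine.raw_connection()   # proxied DBAPI connection from the pool
    cur = raw.cursor()
    cur.execute("select 1")
    print(cur.fetchone())           # (1,)
    raw.close()                     # returned to the pool, not really closed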
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip(conn, request, test_frame1): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1715: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
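These tests are parametrized over fixture names and resolve them lazily through request.getfixturevalue, as its docstring above describes. A hedged sketch of that pattern with an illustrative sqlite fixture (the names are not taken from the pandas suite):

    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        con = sqlite3.connect(":memory:")
        yield con
        con.close()

    @pytest.mark.parametrize("conn", ["sqlite_conn"])
    def test_roundtrip(conn, request):
        # The fixture named by the parameter is resolved only when the test runs.
        con = request.getfixturevalue(conn)
        assert con.execute("select 1").fetchone() == (1,)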
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
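The postgresql_psycopg2_conn fixture seen earlier simply opens a Connection with the context-manager pattern documented in Engine.connect above. A sketch of that pattern against an in-memory SQLite engine, which keeps it runnable without a database server:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")
    with engine.connect() as connection:
        connection.execute(text("create table t (x integer)"))
        connection.execute(text("insert into t values (1)"))
        connection.commit()
        print(connection.execute(text("select x from t")).fetchone())  # (1,)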
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______________ test_api_roundtrip_chunksize[mysql_pymysql_engine] ______________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip_chunksize(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1739: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 
2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 
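The pytest internals quoted above (getfixturevalue -> _get_active_fixturedef -> FixtureDef.execute) are what turn the parametrized string 'mysql_pymysql_engine' into an actual fixture value at test time. A minimal, self-contained sketch of that pattern; the fixture and test names here are illustrative and not part of the pandas suite:

    import pytest

    @pytest.fixture
    def sqlite_str():
        # Stand-in for the engine/connection fixtures in pandas' test_sql.py.
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_roundtrip_example(conn, request):
        # The parametrized value is a fixture *name*; resolve it dynamically,
        # just as test_api_roundtrip_chunksize does above.
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")

Because the lookup happens inside the test body, a broken fixture surfaces as a failure of that one test rather than as a collection error, which matches the tracebacks in this log.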
2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 
2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 
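import_optional_dependency, quoted above, wraps importlib.import_module so that a missing optional package becomes a readable ImportError with install guidance. A hedged sketch of that pattern (not pandas' actual helper, just the same shape):

    import importlib

    def optional_import(name, extra=""):
        # A missing module becomes an ImportError with install guidance,
        # mirroring the docstring quoted above.
        try:
            return importlib.import_module(name)
        except ImportError as err:
            raise ImportError(
                f"Missing optional dependency '{name}'. {extra} "
                f"Use pip or conda to install {name}."
            ) from err

Only ImportError is translated here; the AttributeError raised deeper in the pymysql/cryptography import chain below is a different exception type, which is why these tests error out instead of being skipped.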
2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243350>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243350> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 
2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
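install_as_MySQLdb, shown in the pymysql listing above, is PyMySQL's drop-in compatibility shim: it aliases the pymysql module under the MySQLdb name in sys.modules. A short usage sketch, assuming a working cryptography install (which this testbed evidently lacks):

    import pymysql

    pymysql.install_as_MySQLdb()     # sets sys.modules["MySQLdb"] = pymysql
    import MySQLdb                   # resolves to pymysql from now on

    assert MySQLdb is pymysql
    print(MySQLdb.__version__)       # the mysqlclient-compatible "1.4.6"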
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______________ test_api_roundtrip_chunksize[mysql_pymysql_conn] _______________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip_chunksize(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1739: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 
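The AttributeError reported above for mysql_pymysql_engine (and repeated below for mysql_pymysql_conn) is not a MySQL problem: importing pymysql pulls in cryptography, whose pure-Python hashes.py expects the compiled _rust.openssl extension to expose a `hashes` attribute, and on this i386 testbed it does not. A small, hedged diagnostic, independent of pandas and pymysql, that separates a broken cryptography install from a missing one:

    import cryptography

    def rust_bindings_expose_hashes():
        # The module path and attribute name are taken from the traceback above.
        from cryptography.hazmat.bindings._rust import openssl as rust_openssl
        return hasattr(rust_openssl, "hashes")

    print("cryptography", cryptography.__version__)
    print("_rust.openssl exposes 'hashes':", rust_bindings_expose_hashes())

On a healthy install the second line prints True; a False here, as implied by the AttributeError, usually points to a mismatch between the cryptography Python files and the compiled extension built for this architecture.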
2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243470>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243470> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________ test_api_roundtrip_chunksize[postgresql_psycopg2_engine] ___________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
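The frames above show why the error only appears now: SQLAlchemy engines connect lazily, so the first real use triggers Engine.raw_connection -> pool.connect -> _ConnectionRecord.__connect. A minimal sketch that drives the same path outside the test suite; the URL mirrors the Engine repr in this traceback, and the plain-text password is an assumption taken from the DSN printed further down in the same failure:

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )  # no connection is attempted yet

    try:
        with engine.connect() as conn:   # first use: pool checkout + DBAPI connect
            print("connected")
    except OperationalError as exc:
        # With nothing listening on port 5432, this is the branch taken here.
        print("connect failed:", exc.orig)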
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip_chunksize(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_frame_roundtrip", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1740: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_frame_roundtrip' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
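psycopg2.connect, whose docstring is quoted above, accepts either a libpq-style DSN string or keyword arguments (or a mix of both). A hedged sketch using the exact DSN from this traceback; on this testbed it takes the OperationalError branch because nothing listens on port 5432:

    import psycopg2

    dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    try:
        conn = psycopg2.connect(dsn)                 # DSN-string form
        # Equivalent keyword form:
        # conn = psycopg2.connect(dbname="pandas", user="postgres",
        #                         password="postgres", host="localhost", port=5432)
    except psycopg2.OperationalError as exc:
        print("connection refused:", exc)
    else:
        conn.close()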
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
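The frames above show the dispatch in pandasSQL_builder: a raw sqlite3 connection (or None) is handled by SQLiteDatabase, while a URI string or SQLAlchemy connectable goes through SQLDatabase, which opens a connection from the engine immediately. A small sketch of the same has_table call against an in-memory SQLite database, which avoids the PostgreSQL dependency entirely (illustrative only, not part of the test suite):

import sqlite3

import pandas as pd
from pandas.io import sql

con = sqlite3.connect(":memory:")
pd.DataFrame({"a": [1, 2]}).to_sql("test_frame_roundtrip", con, index=False)

# pandasSQL_builder dispatches to SQLiteDatabase here, so no SQLAlchemy
# engine or network connection is involved.
print(sql.has_table("test_frame_roundtrip", con))  # True
con.close()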
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ____________ test_api_roundtrip_chunksize[postgresql_psycopg2_conn] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
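At this point the original psycopg2.OperationalError has been wrapped by _handle_dbapi_exception_noconnection into sqlalchemy.exc.OperationalError (error code e3q8), which is what callers of engine.connect() actually see. A minimal sketch of catching that wrapped error, assuming SQLAlchemy and psycopg2 as installed above:

from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect():
        pass
except OperationalError as exc:
    # exc.orig carries the underlying psycopg2.OperationalError.
    print(type(exc.orig).__name__, exc.orig)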
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_roundtrip_chunksize(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1739: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
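The frame above (test_sql.py:681) is the postgresql_psycopg2_conn fixture opening a connection from the engine at setup time, which is why the failure is raised inside request.getfixturevalue() rather than in the test body. A simplified sketch of that fixture pattern; the engine fixture here is a reduced stand-in, the real pandas fixture does more setup:

import pytest
from sqlalchemy import create_engine

@pytest.fixture
def postgresql_psycopg2_engine():
    # Simplified stand-in for the pandas fixture of the same name.
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    yield engine
    engine.dispose()

@pytest.fixture
def postgresql_psycopg2_conn(postgresql_psycopg2_engine):
    # Matches the fixture body shown in the traceback: the connection is
    # opened here, so a refused connection errors during fixture setup.
    with postgresql_psycopg2_engine.connect() as conn:
        yield conn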
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _______________ test_api_execute_sql[mysql_pymysql_engine_iris] ________________ 2411s conn = 'mysql_pymysql_engine_iris' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_iris) 2411s def test_api_execute_sql(conn, request): 2411s # drop_sql = "DROP TABLE IF EXISTS test" # should already be done 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1758: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_iris' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_iris' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
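The tests above resolve their connection fixture dynamically because the fixture name arrives as a parametrize value; request.getfixturevalue() then runs the named fixture on demand, which is where the database error surfaces. A minimal sketch of that pattern, with illustrative fixture names:

import pytest

@pytest.mark.parametrize("conn", ["sqlite_conn", "postgresql_psycopg2_conn"])
def test_example(conn, request):
    # Raises if setting up the named fixture fails (for example when the
    # database behind it is unreachable), mirroring the failures in this log.
    conn = request.getfixturevalue(conn)
    assert conn is not None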
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
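For context on the pattern these tracebacks keep walking through: the failing pandas tests are parametrized over fixture names and resolve them at runtime with request.getfixturevalue(), so a broken backend only surfaces when its fixture is actually requested. A minimal, self-contained sketch of that pattern (the fixture and test names below are illustrative stand-ins, not the actual pandas fixtures):

    import sqlite3

    import pytest

    @pytest.fixture
    def sqlite_memory_conn():
        # Hypothetical stand-in for one of the connection fixtures in test_sql.py.
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    # Parametrize over fixture *names*; getfixturevalue() runs the named fixture
    # lazily inside the test body, as in the traceback above.
    @pytest.mark.parametrize("conn", ["sqlite_memory_conn"])
    def test_select_one(conn, request):
        conn = request.getfixturevalue(conn)
        assert conn.execute("SELECT 1").fetchone() == (1,)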
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
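The frame above quotes pluggy's HookCaller.__call__, which only accepts keyword arguments and dispatches to every registered hookimpl. A short sketch of that mechanism, closely following pluggy's documented usage (the project name and hook name here are made up for illustration):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_resource(self, name):
            """Return a resource for `name`; the first non-None result wins."""

    class Plugin:
        @hookimpl
        def setup_resource(self, name):
            return f"resource:{name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    # Hook calls are keyword-only, exactly as the __call__ docstring above states.
    print(pm.hook.setup_resource(name="db"))   # -> "resource:db"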
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243950>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243950> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ________________ test_api_execute_sql[mysql_pymysql_conn_iris] _________________ 2411s conn = 'mysql_pymysql_conn_iris' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_iris) 2411s def test_api_execute_sql(conn, request): 2411s # drop_sql = "DROP TABLE IF EXISTS test" # should already be done 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1758: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_iris' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_iris' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_iris' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
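The override rules spelled out in the comments just above (a same-named fixture can shadow one defined further out, and may request its own name to build on the value it overrides) correspond to a pattern like the following sketch, written as two files with illustrative names:

    # conftest.py
    import pytest

    @pytest.fixture
    def db_dsn():
        return "postgresql://postgres@localhost:5432/pandas"

    # test_module.py: overrides the conftest fixture and requests its own name,
    # so it receives the conftest value "one level up", as described above.
    import pytest

    @pytest.fixture
    def db_dsn(db_dsn):
        return db_dsn + "?connect_timeout=1"

    def test_dsn_has_timeout(db_dsn):
        assert db_dsn.endswith("connect_timeout=1")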
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 
2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 
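The "#6497" comment quoted above explains the try/except around the cache-key comparison: some parametrize values (numpy arrays being the example pytest itself cites) return elementwise results from ==, so bool() on the comparison raises and pytest falls back to an identity check. Illustrated directly:

    import numpy as np

    old_key = np.array([1, 2, 3])
    new_key = np.array([1, 2, 3])

    try:
        cache_hit = bool(old_key == new_key)   # elementwise array, bool() is ambiguous
    except (ValueError, RuntimeError):
        cache_hit = old_key is new_key         # identity fallback, as in the code above
    print(cache_hit)                           # False: equal contents, distinct objects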
2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243a70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb243a70> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ____________ test_api_execute_sql[postgresql_psycopg2_engine_iris] _____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_iris' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_iris) 2411s def test_api_execute_sql(conn, request): 2411s # drop_sql = "DROP TABLE IF EXISTS test" # should already be done 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1758: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_iris' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
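The root cause shown above is a psycopg2.OperationalError: nothing is listening on localhost:5432, so the postgresql_psycopg2_engine_iris fixture cannot be set up. A standalone sketch that reproduces the failing connection attempt with the same parameters shown in the traceback (host=localhost dbname=pandas user=postgres password=postgres port=5432); on a testbed without a running PostgreSQL server it prints the same "Connection refused" message:

import psycopg2

try:
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
except psycopg2.OperationalError as exc:
    # Matches the error in the log: no server accepting TCP/IP connections on port 5432.
    print(f"PostgreSQL not reachable: {exc}")
else:
    conn.close()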
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_iris' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
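The frames above show how test_api_execute_sql is parametrized over fixture names and resolves the actual fixture at run time with request.getfixturevalue(), which is how the failing PostgreSQL fixtures get pulled into every connectable-backed test. A self-contained sketch of that pattern (the sqlite_conn fixture name is illustrative, not one of the pandas fixtures):

import sqlite3
import pytest

@pytest.fixture
def sqlite_conn():
    conn = sqlite3.connect(":memory:")
    yield conn
    conn.close()

@pytest.mark.parametrize("conn", ["sqlite_conn"])
def test_select_one(conn, request):
    # Resolve the fixture named by the parametrize value, as test_api_execute_sql does above.
    conn = request.getfixturevalue(conn)
    assert conn.execute("SELECT 1").fetchone() == (1,)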
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2411s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2411s 2411s def create_and_load_iris(conn, iris_file: Path): 2411s from sqlalchemy import insert 2411s 2411s iris = iris_table_metadata() 2411s 2411s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2411s reader = csv.reader(csvfile) 2411s header = next(reader) 2411s params = [dict(zip(header, row)) for row in reader] 2411s stmt = insert(iris).values(params) 2411s > with conn.begin() as con: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __enter__(self): 2411s # do 
not keep args and kwds alive unnecessarily 2411s # they are only needed for recreation, which is not possible anymore 2411s del self.args, self.kwds, self.func 2411s try: 2411s > return next(self.gen) 2411s 2411s /usr/lib/python3.13/contextlib.py:141: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @contextlib.contextmanager 2411s def begin(self) -> Iterator[Connection]: 2411s """Return a context manager delivering a :class:`_engine.Connection` 2411s with a :class:`.Transaction` established. 2411s 2411s E.g.:: 2411s 2411s with engine.begin() as conn: 2411s conn.execute( 2411s text("insert into table (x, y, z) values (1, 2, 3)") 2411s ) 2411s conn.execute(text("my_special_procedure(5)")) 2411s 2411s Upon successful operation, the :class:`.Transaction` 2411s is committed. If an error is raised, the :class:`.Transaction` 2411s is rolled back. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.connect` - procure a 2411s :class:`_engine.Connection` from 2411s an :class:`_engine.Engine`. 2411s 2411s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2411s for a particular :class:`_engine.Connection`. 2411s 2411s """ 2411s > with self.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _____________ test_api_execute_sql[postgresql_psycopg2_conn_iris] ______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_iris' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_iris) 2411s def test_api_execute_sql(conn, request): 2411s # drop_sql = "DROP TABLE IF EXISTS test" # should already be done 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1758: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_iris' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_iris' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_iris' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2411s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2411s 2411s def create_and_load_iris(conn, iris_file: Path): 2411s from sqlalchemy import insert 2411s 2411s iris = iris_table_metadata() 2411s 2411s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2411s reader = csv.reader(csvfile) 2411s header = next(reader) 2411s params = [dict(zip(header, row)) for row in reader] 2411s stmt = insert(iris).values(params) 2411s > with conn.begin() as con: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __enter__(self): 2411s # do 
not keep args and kwds alive unnecessarily 2411s # they are only needed for recreation, which is not possible anymore 2411s del self.args, self.kwds, self.func 2411s try: 2411s > return next(self.gen) 2411s 2411s /usr/lib/python3.13/contextlib.py:141: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @contextlib.contextmanager 2411s def begin(self) -> Iterator[Connection]: 2411s """Return a context manager delivering a :class:`_engine.Connection` 2411s with a :class:`.Transaction` established. 2411s 2411s E.g.:: 2411s 2411s with engine.begin() as conn: 2411s conn.execute( 2411s text("insert into table (x, y, z) values (1, 2, 3)") 2411s ) 2411s conn.execute(text("my_special_procedure(5)")) 2411s 2411s Upon successful operation, the :class:`.Transaction` 2411s is committed. If an error is raised, the :class:`.Transaction` 2411s is rolled back. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.connect` - procure a 2411s :class:`_engine.Connection` from 2411s an :class:`_engine.Engine`. 2411s 2411s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2411s for a particular :class:`_engine.Connection`. 2411s 2411s """ 2411s > with self.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______________ test_api_date_parsing[mysql_pymysql_engine_types] _______________ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_parsing(conn, request): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1769: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2644d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2644d0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______________ test_api_date_parsing[mysql_pymysql_conn_types] ________________ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_parsing(conn, request): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1769: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 
2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
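The cache-hit check quoted above guards against objects whose `==` does not reduce to a plain boolean (the numpy-array case referenced by #6497). A standalone sketch of the same comparison logic, assuming numpy is available on the testbed:

    import numpy as np

    def cache_hit(request_key, cached_key) -> bool:
        try:
            # Normal equality first; bool() on an array comparison raises ValueError.
            return bool(request_key == cached_key)
        except (ValueError, RuntimeError):
            # Fall back to identity, exactly as the fixture cache does above.
            return request_key is cached_key

    arr = np.array([1, 2, 3])
    print(cache_hit(arr, arr))        # True via the identity fallback
    print(cache_hit((1, 2), (1, 2)))  # True via ordinary equality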
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2645f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2645f0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________ test_api_date_parsing[postgresql_psycopg2_engine_types] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_parsing(conn, request): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1769: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
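The OperationalError above shows that nothing is listening on localhost:5432, so every PostgreSQL-backed fixture in this module fails the same way. A hypothetical standalone connectivity check using the DSN taken from the traceback above (psycopg2 only):

    import psycopg2

    # DSN values copied from the traceback above.
    DSN = "host=localhost dbname=pandas user=postgres password=postgres port=5432"

    try:
        conn = psycopg2.connect(DSN)
    except psycopg2.OperationalError as exc:
        # This is the outcome on this testbed: no PostgreSQL server is running.
        print("unreachable:", exc)
    else:
        print("reachable, server version:", conn.server_version)
        conn.close()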
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
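test_api_date_parsing above parametrizes over fixture *names* and resolves them lazily with request.getfixturevalue, which is why each backend's failure stays confined to its own parametrization. A small sketch of that pattern with hypothetical fixture names (not the ones defined in test_sql.py):

    import pytest

    CONNECTABLE = ["sqlite_engine"]  # hypothetical stand-in for all_connectable_types

    @pytest.fixture
    def sqlite_engine():
        sqlalchemy = pytest.importorskip("sqlalchemy")
        return sqlalchemy.create_engine("sqlite://")

    @pytest.mark.parametrize("conn", CONNECTABLE)
    def test_roundtrip(conn, request):
        engine = request.getfixturevalue(conn)  # resolve the fixture by name at run time
        with engine.connect() as connection:
            assert connection.exec_driver_sql("SELECT 1").scalar() == 1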
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
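The hook invocation above goes through pluggy with firstresult=True, meaning the first hookimpl returning a non-None value wins and its result is handed back to the caller. A self-contained sketch of that calling convention, using a hypothetical "demo" project name:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def compute(self, x):
            """Return a result for x; only the first non-None result is used."""

    class Plugin:
        @hookimpl
        def compute(self, x):
            return x * 2

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    print(pm.hook.compute(x=3))  # 6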
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
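Just below, the connection failure is re-raised by SQLAlchemy as sqlalchemy.exc.OperationalError wrapping the raw psycopg2.OperationalError that was shown earlier as its direct cause. A hypothetical illustration of that wrapping, assuming SQLAlchemy and psycopg2 are installed and nothing listens on port 5432:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        print(type(exc.orig).__name__)  # the underlying DBAPI error, kept as .orig
        print(exc)                      # includes the sqlalche.me/e/20/e3q8 hint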
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ____________ test_api_date_parsing[postgresql_psycopg2_conn_types] _____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_parsing(conn, request): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1769: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
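[Editor's note] The pytest frames above show how these pandas SQL tests obtain their database handles: the parametrized value is a fixture *name*, resolved at run time with request.getfixturevalue, so a missing backing service surfaces during fixture setup rather than inside the test body. A minimal, self-contained sketch of that pattern, with hypothetical fixture and test names that are not taken from the pandas suite:

    import pytest

    @pytest.fixture
    def fake_engine():
        # Hypothetical stand-in for a fixture such as postgresql_psycopg2_engine;
        # a real fixture here would open the database connection, which is the
        # step that fails in the traceback above.
        return "engine-placeholder"

    @pytest.mark.parametrize("conn", ["fake_engine"])
    def test_lookup(conn, request):
        # Same pattern as test_sql.py above: the parameter is the fixture's name,
        # looked up dynamically through the built-in `request` fixture.
        conn = request.getfixturevalue(conn)
        assert conn == "engine-placeholder"
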
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...: >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...: >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
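[Editor's note] The repeated OperationalError above reduces to one condition: nothing is listening on localhost:5432 in the testbed, so every PostgreSQL-backed fixture fails at connect time (the DSN in the traceback is host=localhost dbname=pandas user=postgres port=5432). A small pre-flight check of that assumption, sketched here for illustration only and not part of the test suite (3306 is assumed as the default MySQL port for the mysql_pymysql_* fixtures):

    import socket

    def db_reachable(host="localhost", port=5432, timeout=1.0):
        # True if something accepts TCP connections on host:port; False on
        # refusal or timeout, which is what the testbed reports above.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        for name, port in (("PostgreSQL", 5432), ("MySQL", 3306)):
            state = "reachable" if db_reachable(port=port) else "connection refused"
            print(f"{name} on localhost:{port}: {state}")
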
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
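Only the try: line of versioned_importorskip's body is visible in this traceback; its docstring says it replaces pytest.importorskip while enforcing pandas' minimum version via import_optional_dependency. The following is a rough, purely illustrative reconstruction under that assumption, not the actual Debian patch:

    import pytest
    from pandas.compat._optional import import_optional_dependency

    def versioned_importorskip(*args, **kwargs):
        # Sketch only: turn a missing or too-old optional dependency into a skip.
        try:
            module = import_optional_dependency(*args, **kwargs)
        except ImportError as exc:
            pytest.skip(f"could not import {args[0]}: {exc}")
        return module

Note that the failure in this log is an AttributeError raised while pymysql itself imports cryptography, so a wrapper of this shape would not convert it into a skip.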
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
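For reference, the errors modes documented in that docstring translate into calls like the following (pandas.compat._optional is a private pandas helper; this only illustrates the semantics quoted above):

    from pandas.compat._optional import import_optional_dependency

    # errors="raise" (default): ImportError with an install hint if pymysql is missing
    pymysql = import_optional_dependency("pymysql")

    # errors="ignore": None when the package is absent, the module otherwise,
    # even if its version is older than pandas' minimum
    maybe_pymysql = import_optional_dependency("pymysql", errors="ignore")
    if maybe_pymysql is None:
        print("pymysql not available")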
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb265fd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb265fd0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
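The failing test shown above parametrizes conn over fixture names given as strings and resolves the real object at setup time with request.getfixturevalue(conn), which is why every broken backend surfaces as an error inside this one test. A condensed sketch of that pattern; the fixture bodies are placeholders, not pandas' real ones:

    import pytest

    @pytest.fixture
    def sqlite_conn():
        return "sqlite-connection"                   # placeholder object

    @pytest.fixture
    def mysql_conn():
        raise RuntimeError("backend unavailable")    # stands in for the failing setup

    @pytest.mark.parametrize("conn", ["sqlite_conn", "mysql_conn"])
    def test_backend(conn, request):
        conn = request.getfixturevalue(conn)         # resolved dynamically, as above
        assert conn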
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
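The nested _get_active_fixturedef/execute frames above are pytest walking the fixture dependency chain mysql_pymysql_conn_types -> mysql_pymysql_engine_types -> mysql_pymysql_engine; the innermost fixture is the one that imports pymysql, so its failure propagates back up through every dependent fixture. Schematically (names mirror the log, bodies are simplified stand-ins):

    import pytest

    @pytest.fixture
    def mysql_pymysql_engine():
        pytest.importorskip("pymysql")     # stand-in for td.versioned_importorskip
        return object()                    # placeholder for the SQLAlchemy engine

    @pytest.fixture
    def mysql_pymysql_engine_types(mysql_pymysql_engine):
        return mysql_pymysql_engine        # would normally also load the "types" table

    @pytest.fixture
    def mysql_pymysql_conn_types(mysql_pymysql_engine_types):
        return mysql_pymysql_engine_types  # would normally open a connection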
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
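The finalizer registered above is built with functools.partial so that the request argument is pre-bound before the callable is handed to each parent fixture's addfinalizer. In isolation:

    import functools

    class Demo:
        def finish(self, request):
            print(f"finishing for {request}")

    d = Demo()
    finalizer = functools.partial(d.finish, request="req-1")
    finalizer()   # equivalent to d.finish(request="req-1")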
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266090>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266090> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
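In this second failure family the error originates in SQLAlchemy's connection pool: create_engine() is lazy, so the first real DBAPI connection is attempted only when Engine.raw_connection() checks a connection out of the pool inside the fixture. A minimal sketch of the same call path; the URL mirrors the one in the log and needs a reachable server to succeed:

    from sqlalchemy import create_engine

    # Building the engine does not connect yet (lazy initialization).
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    # The first pool checkout triggers the actual DBAPI connect(); with nothing
    # listening on localhost:5432 this raises OperationalError, as seen below.
    conn = engine.raw_connection()
    conn.close()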
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
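The OperationalError above ("Connection refused" on localhost:5432 over both IPv6 and IPv4) indicates that no PostgreSQL server is reachable on the testbed, so every postgresql_psycopg2_* fixture in this run fails the same way. A quick reachability check one could run on such a testbed (illustrative):

    import socket

    # Probe the port the fixtures expect a PostgreSQL server on.
    with socket.socket() as s:
        s.settimeout(1)
        try:
            s.connect(("localhost", 5432))
            print("something is listening on 5432")
        except OSError as exc:
            print(f"no PostgreSQL server reachable: {exc}")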
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...nction test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...nction test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
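For context on the pytest machinery being unwound here: test_api_custom_dateparsing_error receives the connection fixture name as a parametrized string and resolves it at run time with request.getfixturevalue(), which is what routes the setup failure through _get_active_fixturedef and FixtureDef.execute above. A small self-contained sketch of that pattern (fixture and test names are illustrative, not pandas'):

import pytest

@pytest.fixture
def fake_engine_types():
    # Stand-in for postgresql_psycopg2_engine_types; the real fixture connects
    # to PostgreSQL during setup, which is where the OperationalError escapes.
    return "engine"

@pytest.mark.parametrize("conn", ["fake_engine_types"])
def test_lookup_by_name(conn, request):
    conn = request.getfixturevalue(conn)  # dynamic lookup, as in test_sql.py
    assert conn == "engine"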
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...Function test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...Function test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
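The pluggy frames above are generic hook-dispatch machinery (pytest_fixture_setup is a firstresult hook). A tiny standalone pluggy example, unrelated to pandas and using made-up hook names, showing the same _hookexec call path:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class DemoSpec:
    @hookspec(firstresult=True)
    def compute(self, x):
        """With firstresult=True the first non-None implementation result wins."""

class DemoPlugin:
    @hookimpl
    def compute(self, x):
        return x + 1

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(DemoSpec)
pm.register(DemoPlugin())
print(pm.hook.compute(x=41))  # 42 -- dispatched through _hookexec, as above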
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
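The frames above reach psycopg2's connect() with host=localhost, port=5432, dbname=pandas; the OperationalError captured further up ("Connection refused" on both ::1 and 127.0.0.1) shows that no PostgreSQL server is listening on the testbed, so every postgres-backed fixture fails the same way. A minimal sketch of the equivalent connection attempt, assuming SQLAlchemy and psycopg2 are installed and using the parameters visible in cparams above:

    # Illustration only: reproduces the fixture's connection attempt.
    # With no PostgreSQL service on localhost:5432 this raises the same
    # OperationalError seen in the log.
    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(sqlalchemy.text("SELECT 1"))
    except OperationalError as exc:
        print("connection failed:", exc.orig)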
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
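For context on why the failure surfaces during fixture setup rather than in the test body: test_api_custom_dateparsing_error receives its connection fixture as a parametrized *name* and resolves it at run time with request.getfixturevalue(), which is exactly the dynamic-lookup path walked in the frames above. A minimal sketch of that pattern (the fixture name my_engine is hypothetical):

    import pytest

    @pytest.fixture
    def my_engine():
        # Hypothetical stand-in for mysql_pymysql_engine in the log above.
        return object()

    @pytest.mark.parametrize("conn", ["my_engine"])
    def test_uses_named_fixture(conn, request):
        # The parametrized value is a fixture *name*; resolve it dynamically.
        conn = request.getfixturevalue(conn)
        assert conn is not None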
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
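The pluggy frames here are generic hook dispatch: pytest_fixture_setup is an ordinary pluggy hook, and _hookexec simply runs the registered implementations (stopping at the first non-None result for firstresult hooks). A minimal, self-contained pluggy sketch of that mechanism; the project and hook names are hypothetical, not pytest's:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_thing(self, name): ...

    class Plugin:
        @hookimpl
        def setup_thing(self, name):
            return f"made {name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    print(pm.hook.setup_thing(name="fixture"))  # -> "made fixture"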
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266750>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266750> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266810>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266810> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...unction test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...unction test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266ed0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266ed0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266f90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb266f90> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...nction test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...nction test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
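The pytest_fixture_setup call above goes through pluggy's hook machinery, where firstresult=True means the first non-None return value is used. A minimal pluggy sketch under that assumption (the project name and hook are made up for illustration):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def compute(self, x):
            """Return a value for x; only the first non-None result is kept."""

    class Plugin:
        @hookimpl
        def compute(self, x):
            return x * 2

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    assert pm.hook.compute(x=3) == 6  # a single value, not a list, because firstresult=True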
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
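For reference, the connection the fixture attempts is equivalent to the following sketch, built from the DSN shown above; it assumes a PostgreSQL server is listening on localhost:5432, and the "Connection refused" errors below show that no such server is running in this testbed:

    import psycopg2
    from sqlalchemy import create_engine, text

    # Direct DBAPI connection with the same parameters as the traceback's DSN.
    # Raises psycopg2.OperationalError ("Connection refused") when nothing is
    # listening on localhost:5432.
    conn = psycopg2.connect(
        "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    )

    # The equivalent SQLAlchemy engine used by the pandas fixtures; connect()
    # wraps the same failure in sqlalchemy.exc.OperationalError.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    with engine.connect() as connection:
        connection.execute(text("SELECT 1"))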
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'SELECT * FROM types' 2411s mode = ('sqlalchemy', 'fallback'), error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
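Test suites commonly guard database-backed fixtures by probing the server and skipping when it is unreachable, rather than erroring during fixture setup as seen here. A sketch of that pattern (this is not how the pandas suite is written, just a common approach):

    import pytest
    from sqlalchemy import create_engine, exc

    @pytest.fixture
    def postgresql_engine_or_skip():
        engine = create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
        )
        try:
            # Probe once; skip the dependent tests if the server is down.
            with engine.connect():
                pass
        except exc.OperationalError:
            pytest.skip("PostgreSQL server not reachable on localhost:5432")
        return engine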
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...Function test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...Function test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
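The finalizer bookkeeping above ensures a dependent fixture is torn down before the fixtures it requested. A self-contained sketch of that ordering with request.addfinalizer (fixture names are illustrative):

    import pytest

    events = []

    @pytest.fixture
    def outer(request):
        events.append("outer setup")
        request.addfinalizer(lambda: events.append("outer teardown"))
        return events

    @pytest.fixture
    def inner(outer, request):
        # inner depends on outer, so inner's finalizer runs first at teardown.
        events.append("inner setup")
        request.addfinalizer(lambda: events.append("inner teardown"))
        return events

    def test_order(inner):
        assert events == ["outer setup", "inner setup"]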
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267650>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267650> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267710>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267710> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
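[editor's note] The failure above ends at `AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'`: the `mysql_pymysql_engine` fixture calls pandas' `versioned_importorskip("pymysql")`, which (per the quoted source) only turns an ImportError into a skip. Here the pymysql import chain reaches `cryptography/hazmat/primitives/hashes.py`, which raises AttributeError because the installed Python package and its compiled Rust bindings disagree, so the exception is not an ImportError and the fixture errors instead of skipping. A minimal sketch, with a hypothetical helper name, of why only ImportError leads to a skip:

```python
# Minimal sketch (hypothetical helper, not the pandas implementation): the
# importorskip pattern treats only ImportError as "dependency missing"; any
# other exception raised while the module body executes (here, AttributeError
# from cryptography's Rust bindings) propagates and fails the fixture.
import importlib

import pytest


def importorskip_sketch(name: str):
    try:
        return importlib.import_module(name)
    except ImportError:
        pytest.skip(f"optional dependency {name!r} is not installed")
    # AttributeError etc. are deliberately not caught and surface as errors.
```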
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
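[editor's note] The parametrized `test_api_custom_dateparsing_error` resolves its database backend lazily: the `conn` parameter is a fixture *name*, and the test body calls `request.getfixturevalue(conn)`, which is why the connection failure surfaces during fixture setup rather than at collection time. A small, self-contained sketch of that pattern, using a hypothetical SQLite-only fixture so it runs without any server:

```python
# Sketch of the parametrize-over-fixture-names pattern used by the pandas SQL
# tests (hypothetical fixture, not from the suite): the fixture is looked up
# by name at run time, so an unavailable backend fails in fixture setup.
import pytest


@pytest.fixture
def sqlite_url():
    return "sqlite://"  # in-memory SQLite, always available


@pytest.mark.parametrize("conn", ["sqlite_url"])
def test_backend_roundtrip(conn, request):
    url = request.getfixturevalue(conn)  # dynamic fixture resolution
    assert url.startswith("sqlite")
```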
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...' for >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...' for >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
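[editor's note] The PostgreSQL-backed failures in this block share one root cause: nothing is listening on localhost:5432 inside the testbed, so every fixture built on `postgresql_psycopg2_engine` dies in `psycopg2.connect()` with "Connection refused", and SQLAlchemy re-raises it as `sqlalchemy.exc.OperationalError` via `_handle_dbapi_exception_noconnection`. A minimal sketch, assuming the same URL and credentials the suite uses, of how a harness could probe the server once and skip cleanly instead of erroring:

```python
# Minimal sketch (assumption: the postgres:postgres@localhost:5432/pandas
# credentials used by the pandas test suite, with sqlalchemy and psycopg2
# installed): probe the server and skip dependent tests when unreachable,
# mirroring the Engine.connect() path shown in the traceback above.
import pytest
import sqlalchemy
from sqlalchemy.exc import OperationalError

PG_URL = "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"


def require_postgres():
    engine = sqlalchemy.create_engine(PG_URL)
    try:
        with engine.connect():  # raises OperationalError if no server
            pass
    except OperationalError as exc:
        pytest.skip(f"PostgreSQL not reachable: {exc}")
    return engine
```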
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...es' for >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...es' for >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267cb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267cb0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
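[Editor's sketch, not part of the log: the install_as_MySQLdb docstring quoted above describes PyMySQL's MySQLdb compatibility shim; a minimal illustration of that aliasing, which needs no MySQL server because it only touches sys.modules.]

import pymysql

pymysql.install_as_MySQLdb()

import MySQLdb  # after the call above, this name resolves to the pymysql module

print(MySQLdb is pymysql)  # True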
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
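The getfixturevalue docstring quoted a little above describes the dynamic lookup these parametrized pandas tests rely on: the fixture name arrives as a parametrize string and is only resolved and executed at test setup time, which is why a broken backing fixture (the pymysql import failure here, the unreachable PostgreSQL server further below) surfaces during fixture setup rather than at collection time. A short sketch of the same pattern, using hypothetical fixture and test names with the standard pytest API:

    # Hedged sketch of request.getfixturevalue() with a parametrized fixture
    # name; the fixture is looked up and run only when the test body executes.
    import pytest

    @pytest.fixture
    def demo_conn():
        return "demo-connection"

    @pytest.mark.parametrize("conn", ["demo_conn"])
    def test_dynamic_lookup(conn, request):
        conn = request.getfixturevalue(conn)
        assert conn == "demo-connection"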
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267d70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267d70> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
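The frames that follow walk from Engine.connect() down through SQLAlchemy's connection pool to the DBAPI driver; the refused TCP connection only appears at the very bottom of the chain. A hedged sketch of the same path, using the URL shown in the log (credentials and host are the pandas test suite's defaults, not something verified here):

    # Hedged sketch: build the same engine the fixture uses and attempt to
    # connect; on this testbed nothing listens on localhost:5432, so
    # connect() raises an OperationalError wrapped by SQLAlchemy.
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except Exception as exc:
        print("connect failed:", exc)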
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
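At the DBAPI level the failure above reduces to psycopg2 being unable to reach a PostgreSQL server at all. A minimal sketch of the same connection attempt, with the parameters taken from the dsn shown in the traceback (whether this testbed is expected to provide the service is outside the scope of this log):

    # Hedged sketch: the raw connection attempt that fails above, using the
    # parameters from the traceback's dsn; with no server running this
    # raises OperationalError ("Connection refused").
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", dbname="pandas", user="postgres",
            password="postgres", port=5432,
        )
    except psycopg2.OperationalError as exc:
        print("postgres unavailable:", exc)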
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...s' for >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...s' for >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
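The frame above is psycopg2's top-level connect(), whose docstring lists the basic connection parameters. A minimal standalone sketch of the same call, assuming (unlike in this log) a PostgreSQL server is actually listening on localhost:5432 and using the same illustrative credentials as the test fixtures:

import psycopg2

# Keyword arguments are merged into a DSN by make_dsn(), exactly as in the
# frame above; these credential values mirror the test suite's defaults and
# are assumptions, not a working configuration.
conn = psycopg2.connect(
    dbname="pandas",
    user="postgres",
    password="postgres",
    host="localhost",
    port=5432,
)
cur = conn.cursor()
cur.execute("SELECT 1")   # trivial round-trip to confirm the connection works
print(cur.fetchone())
cur.close()
conn.close()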
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
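The comments in _get_active_fixturedef above describe fixture overriding: a same-named fixture closer to the test wins, and an overriding fixture that requests its own name receives the value of the fixture one level up. A minimal sketch of that behaviour with hypothetical names:

# conftest.py
import pytest

@pytest.fixture
def greeting():
    return "hello"

# test_override.py
import pytest

@pytest.fixture
def greeting(greeting):
    # Requesting our own name hands us the conftest fixture one level up,
    # as the comments in the traceback describe.
    return greeting + ", world"

def test_greeting(greeting):
    assert greeting == "hello, world"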
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...pes' for >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...pes' for >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
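The pluggy frames above (HookCaller.__call__ and PluginManager._hookexec) show how pytest_fixture_setup is dispatched to registered plugin implementations with keyword-only arguments. A minimal self-contained sketch of the same mechanism, with a hypothetical project name, hook, and plugin:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)
    def setup_item(self, name):
        """Return a value for the named item."""

class Plugin:
    @hookimpl
    def setup_item(self, name):
        return f"set up {name}"

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(Plugin())

# Keyword-only call, routed through _hookexec as in the traceback above;
# firstresult=True returns the first non-None implementation result.
print(pm.hook.setup_item(name="types"))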
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
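The DSN in this frame corresponds to the SQLAlchemy URL shown in the Engine reprs throughout the traceback. A minimal sketch of the engine-level usage that create_and_load_types attempts, following the Engine.connect() docstring quoted earlier; the URL and query are illustrative and again assume a reachable server:

from sqlalchemy import create_engine, text

# URL mirrors the Engine repr in the log; the password and database name are
# the test suite's defaults, not a recommendation.
engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

# Context-manager usage from the docstring: the connection is returned to the
# pool (and any open transaction rolled back) when the block exits.
with engine.connect() as connection:
    connection.execute(text("SELECT 1"))
    connection.commit()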
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
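The override comments quoted above describe a fixture that requests its own name in order to receive the value it shadows, one level up. A minimal sketch of that pattern, split across two files (hypothetical names, not from the pandas suite):

# conftest.py
import pytest

@pytest.fixture
def db_url():
    return "sqlite://"

# test_override.py (overrides the conftest fixture and requests its own name)
import pytest

@pytest.fixture
def db_url(db_url):
    # Receives the conftest value one level up, as _get_active_fixturedef resolves it.
    return db_url + "/override"

def test_db_url(db_url):
    assert db_url == "sqlite:///override"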
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267950>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267950> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
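The mysql_pymysql_engine fixture earlier in this traceback guards its imports with versioned_importorskip (a Debian-specific wrapper, per its own docstring). The stock pytest equivalent is a sketch like this:

import pytest

# importorskip imports the module or marks the test as skipped;
# the pandas wrapper additionally enforces a minimum version.
pymysql = pytest.importorskip("pymysql")
sqlalchemy = pytest.importorskip("sqlalchemy", minversion="2.0")

def test_driver_importable():
    assert hasattr(pymysql, "connect")

Both helpers are designed to turn a missing module (ImportError) into a skip; the AttributeError raised inside pymysql's import chain here is a different exception class, which is why these parametrizations fail rather than skip.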
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267a10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb267a10> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
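For reference, the mysqlclient-compatibility shim quoted above is used as follows. A minimal sketch (only relevant to code that imports MySQLdb, such as Django's MySQL backend, and it assumes pymysql imports cleanly, which it does not in this environment):

import pymysql

pymysql.install_as_MySQLdb()   # registers pymysql under the name "MySQLdb"

import MySQLdb  # noqa: E402   # now resolves to the pymysql module
assert MySQLdb is pymysql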
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
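The AttributeError reproduced below (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') usually points at a mismatch between cryptography's Python sources and its compiled _rust bindings. A small diagnostic sketch, assuming only that cryptography itself is importable:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography version:", cryptography.__version__)
# On a consistent install the compiled bindings expose an openssl.hashes
# submodule; in this log it is missing, so importing pymysql fails.
print("has openssl.hashes:", hasattr(rust_openssl, "hashes"))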
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
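The AttributeError just above is the root cause of the pymysql-based failure reported here: the pure-Python layer of python3-cryptography expects an openssl.hashes submodule that the installed compiled _rust binding does not expose, so `import pymysql` aborts before any test code runs. A hedged standalone check of the same lookup (module paths taken from the traceback; the exit message is illustrative):

    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # pymysql._auth fails at exactly this point when the compiled binding and
    # the pure-Python layer come from mismatched cryptography builds
    if not hasattr(rust_openssl, "hashes"):
        raise SystemExit("cryptography _rust binding lacks 'hashes'; builds are mismatched")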
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
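The connection-refused error above comes straight from the DBAPI layer: the psycopg2.connect docstring just quoted describes the call the fixture makes, and no PostgreSQL server is listening on localhost:5432 in this testbed. A short hedged sketch of that attempt (parameters copied from the DSN shown in the traceback):

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432, dbname="pandas",
            user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # in this run: "Connection refused" on both ::1 and 127.0.0.1
        print(f"postgres not reachable: {exc}")
    else:
        conn.close()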
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...' for >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...' for >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...es' for >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...es' for >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ad3d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ad3d0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ad490>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ad490> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ... test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ... test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
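Note on the failure above: the psycopg2.OperationalError ("Connection refused" on localhost:5432, for both ::1 and 127.0.0.1) indicates that no PostgreSQL server is listening in this testbed, so every PostgreSQL-backed fixture fails the moment it tries to connect. A minimal sketch of a guard one could use when reproducing this locally, assuming the same host and port as the DSN shown in the traceback; the helper and marker names are hypothetical and are not part of the pandas test suite:

    # Hypothetical helper: probe the host/port from the DSN in the traceback and
    # skip PostgreSQL-backed tests when nothing is listening there.
    import socket

    import pytest


    def _postgres_reachable(host: str = "localhost", port: int = 5432) -> bool:
        # A refused TCP connect here mirrors the psycopg2.OperationalError in the log.
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            return False


    requires_postgres = pytest.mark.skipif(
        not _postgres_reachable(), reason="no PostgreSQL server on localhost:5432"
    )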
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...on test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...on test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
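The getfixturevalue docstring quoted above describes the dynamic-lookup pattern that test_api_custom_dateparsing_error relies on: the connection fixture name arrives as a parametrized string and is resolved at run time, which is why the PostgreSQL and MySQL connection errors surface during fixture setup rather than in the test body. A minimal standalone sketch of that pattern, using a hypothetical in-memory SQLite fixture so it does not need the servers that are missing here:

    # Sketch only: fixture and test names are illustrative, not from the pandas suite.
    import sqlite3

    import pytest


    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()


    @pytest.mark.parametrize("conn_name", ["sqlite_conn"])
    def test_roundtrip(conn_name, request):
        # Resolve the fixture by its parametrized name, as in the log above.
        conn = request.getfixturevalue(conn_name)
        conn.execute("CREATE TABLE t (x INTEGER)")
        conn.execute("INSERT INTO t VALUES (1)")
        assert conn.execute("SELECT x FROM t").fetchone() == (1,)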
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
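[Editorial aside] The frames above trace pytest resolving a fixture that the test requested dynamically by name (request.getfixturevalue -> _get_active_fixturedef -> FixtureDef.execute -> pytest_fixture_setup). A minimal, self-contained illustration of that pattern follows; the fixture and test names are invented for illustration and are not the pandas test code quoted later in this log.

import pytest


@pytest.fixture
def sqlite_engine():
    # Stand-in fixture; the real pandas tests build SQLAlchemy engines here.
    return "sqlite-engine"


# The test is parametrized over *fixture names* (strings) and resolves the
# actual fixture value at setup time via request.getfixturevalue(), which is
# the code path traced in the frames above.
@pytest.mark.parametrize("conn", ["sqlite_engine"])
def test_roundtrip(conn, request):
    conn = request.getfixturevalue(conn)
    assert conn == "sqlite-engine"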
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
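[Editorial aside] versioned_importorskip and import_optional_dependency, quoted above, implement "import this optional package, or skip the test if it is missing or too old". A rough sketch of that behaviour is below; the helper name, the use of packaging.version, and the details are illustrative assumptions, not pandas' actual implementation.

import importlib

import pytest
from packaging.version import Version


def importorskip_min(name, min_version=None):
    # Import the optional dependency, or skip the calling test if it is
    # missing or older than the required minimum version.
    try:
        module = importlib.import_module(name)
    except ImportError:
        pytest.skip(f"missing optional dependency {name!r}")
    installed = getattr(module, "__version__", None)
    if min_version is not None and (
        installed is None or Version(installed) < Version(min_version)
    ):
        pytest.skip(f"{name!r} {installed} is older than required {min_version}")
    return module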
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2adb50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2adb50> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2adc10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2adc10> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...n test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...n test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback'), error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...ion test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...ion test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ae2d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ae2d0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ae390>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2ae390> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
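The AttributeError above appears to be the root cause of the mysql_pymysql_* fixture errors in this run: importing pymysql pulls in cryptography, and this cryptography installation's Rust extension does not expose a `hashes` submodule, so `Hash = rust_openssl.hashes.Hash` fails at import time, long before any pandas code runs. A minimal diagnostic sketch (not part of the test suite, file name invented) that isolates the same import failure:

    # probe_crypto_hashes.py -- hypothetical sketch; it only repeats the import
    # that fails in the traceback above.
    try:
        from cryptography.hazmat.primitives import hashes  # noqa: F401
    except AttributeError as exc:
        # Matches the log: the Rust bindings shipped with this cryptography
        # build lack `openssl.hashes`, so the pure-Python layer cannot bind Hash.
        print(f"broken cryptography install: {exc}")
    else:
        print("cryptography hashes import OK")

Because the pymysql import dies with an AttributeError rather than a clean ImportError, these cases surface in this log as fixture setup errors instead of the usual "optional dependency missing" skips.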
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
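This docstring describes the mechanism the pandas test above relies on: the connection fixture name arrives as a parametrized string (here `conn = 'postgresql_psycopg2_engine_types'`), and the test resolves it lazily with `request.getfixturevalue(conn)`, which is why the database error is raised during fixture setup inside the test call rather than at collection time. A minimal, self-contained illustration of the same pattern (fixture names invented for the sketch):

    # sketch of the lazy fixture lookup used in pandas/tests/io/test_sql.py;
    # "memory_backend" and "server_backend" are made-up names for illustration.
    import pytest

    @pytest.fixture
    def memory_backend():
        return "sqlite:///:memory:"

    @pytest.fixture
    def server_backend():
        return "postgresql://localhost:5432/pandas"

    @pytest.mark.parametrize("backend_name", ["memory_backend", "server_backend"])
    def test_backend_url(backend_name, request):
        # The fixture is chosen at run time from its name, just like
        # request.getfixturevalue(conn) in test_api_custom_dateparsing_error.
        url = request.getfixturevalue(backend_name)
        assert url.startswith(("sqlite", "postgresql"))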
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ... test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ... test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_engine_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
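[The PostgreSQL-backed failures that recur through this part of the log all share one root cause visible in the traceback above: nothing is accepting connections on localhost:5432, so the postgresql_psycopg2 fixtures fail while building their SQLAlchemy engine and each parametrised case errors out during fixture setup. As a rough illustrative sketch only (the probe below is not part of the pandas test suite or of this log; the three-second connect_timeout is an assumption), the DSN quoted in the traceback could be checked up front with psycopg2 before running the SQL tests:

    import psycopg2

    # Connection parameters copied from the DSN shown in the traceback.
    params = dict(
        host="localhost", port=5432, dbname="pandas",
        user="postgres", password="postgres",
    )
    try:
        # Extra keyword arguments are folded into the libpq DSN;
        # connect_timeout keeps the probe from hanging.
        psycopg2.connect(connect_timeout=3, **params).close()
        print("PostgreSQL is reachable on localhost:5432")
    except psycopg2.OperationalError as exc:
        print(f"PostgreSQL is not reachable: {exc}")

In this autopkgtest environment the probe would presumably report the same "Connection refused", which is why the identical OperationalError repeats for every postgresql_psycopg2-parametrised test below.]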
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = 2411s text = 'SELECT * FROM types', mode = ('sqlalchemy', 'fallback') 2411s error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...on test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...on test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_conn_types]>>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
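A sketch (not from the log) of how the helper quoted above behaves for an optional dependency. The module path is the one shown in the traceback, but it is a pandas-internal API, so treat the import location as an assumption:

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore": return the module if importable, otherwise None
    # (per the docstring above, too-old versions are returned unchecked).
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql not importable; the MySQL fixtures would be skipped")
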
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2aea50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2aea50> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
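The `HookCaller.__call__` and `_hookexec` frames above show pluggy dispatching `pytest_fixture_setup` as a keyword-only, firstresult hook; a self-contained sketch of that calling convention with a made-up hook name:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)          # stop at the first non-None result
        def demo_hook(self, value): ...

    class Plugin:
        @hookimpl
        def demo_hook(self, value):
            return value * 2

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    print(pm.hook.demo_hook(value=21))       # 42; only keyword arguments are accepted
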
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2aeb10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2aeb10> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
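The two `AttributeError` failures above reduce to the compiled Rust bindings of python3-cryptography lacking the `hashes` submodule on this testbed; a minimal reproducer (not from the log, and on a healthy installation the attribute access simply succeeds):

    # Mirrors /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # On this testbed the next line raises
    #   AttributeError: module 'cryptography.hazmat.bindings._rust.openssl'
    #   has no attribute 'hashes'
    # which in turn breaks `import pymysql` and every fixture that needs it.
    Hash = rust_openssl.hashes.Hash
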
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'ignore' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...r >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...r >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af230>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af230> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
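The AttributeError above is raised inside python3-cryptography itself while pymysql is still being imported: hashes.py expects the compiled cryptography.hazmat.bindings._rust.openssl extension to expose a hashes attribute, and in this environment it does not. That points at a mismatch between the pure-Python part of cryptography and its Rust bindings rather than at pandas or PyMySQL. Under that assumption, the problem can be confirmed without the test suite by importing the hashes module directly; an illustrative sketch:

    # If cryptography and its _rust bindings are out of sync, this import alone
    # raises the same AttributeError, with no pandas or pymysql involved.
    try:
        from cryptography.hazmat.primitives import hashes
        print("cryptography hashes OK:", hashes.SHA256().name)
    except AttributeError as exc:
        print("broken cryptography install:", exc)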
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
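[editor's note] The pluggy call path shown above (HookCaller.__call__ -> _hookexec) is the same machinery any pluggy user goes through; a stand-alone sketch with hypothetical hook and plugin names:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_value(self, request):
            """Return a value for the given request."""

    class Plugin:
        @hookimpl
        def setup_value(self, request):
            return f"value for {request}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())

    # Keyword-only call, just like ihook.pytest_fixture_setup(fixturedef=..., request=...)
    print(pm.hook.setup_value(request="r1"))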
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
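[editor's note] Going by the docstring quoted above, the pandas helper can be exercised directly; a sketch, assuming a working pandas installation (note this is a private helper, pandas.compat._optional, and may change between releases):

    from pandas.compat._optional import import_optional_dependency

    # errors="raise" (the default used by versioned_importorskip here) raises
    # ImportError if pymysql is missing or older than pandas' minimum version.
    pymysql = import_optional_dependency("pymysql")

    # errors="ignore" returns the module regardless of version, or None if it
    # is not installed, leaving version checks to the caller.
    maybe_pymysql = import_optional_dependency("pymysql", errors="ignore")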
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af2f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af2f0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
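[editor's note] The AttributeError above means the Python side of cryptography cannot find its Rust-backed hashes module, so even the basic, documented use of the HashContext interface quoted earlier fails on this testbed. On a consistent installation it is simply:

    from cryptography.hazmat.primitives import hashes

    # Hash is a HashContext: update() feeds data, finalize() returns the digest.
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas test data")
    print(digest.finalize().hex())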
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
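[editor's note] The failing test above passes fixture *names* as parametrize values and resolves them at run time with request.getfixturevalue, exactly as the docstring just quoted describes; a minimal sketch with a hypothetical stand-in fixture:

    import pytest

    @pytest.fixture
    def dummy_engine():
        # Hypothetical stand-in for postgresql_psycopg2_engine_types.
        return "engine"

    @pytest.mark.parametrize("conn", ["dummy_engine"])
    def test_dynamic_fixture_lookup(conn, request):
        # Same pattern as test_sql.py: the parametrized value is a fixture
        # name, looked up dynamically during the test.
        engine = request.getfixturevalue(conn)
        assert engine == "engine"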
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ... >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ... >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
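[editor's note] The create_and_load_types helper seen in this traceback builds an insert() statement and runs it inside engine.connect(); a self-contained sketch of that pattern, using SQLite so it does not need the unavailable PostgreSQL server, with hypothetical table and column names:

    from sqlalchemy import (
        Column, Integer, MetaData, String, Table, create_engine, insert,
    )

    engine = create_engine("sqlite:///:memory:")
    metadata = MetaData()
    types = Table(
        "types", metadata,
        Column("TextCol", String),
        Column("IntCol", Integer),
    )
    metadata.create_all(engine)

    stmt = insert(types).values([{"TextCol": "first", "IntCol": 1}])
    with engine.connect() as conn:
        conn.execute(stmt)
        conn.commit()  # SQLAlchemy 2.0 style: commit the autobegun transaction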
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
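[editor's note] The keyword form described in the psycopg2 docstring above matches the cparams shown in this traceback; a sketch, assuming a PostgreSQL server is actually listening on localhost:5432 (here none is, hence the "Connection refused"):

    import psycopg2

    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())
    conn.close()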
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
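# The frame that follows builds the final DSN string from the keyword arguments via
# make_dsn before calling the C-level _connect. A small sketch of that merge step; the
# credential values are the assumed test parameters already visible in this traceback,
# and the exact key order of the resulting string is not guaranteed.
from psycopg2.extensions import make_dsn

dsn = make_dsn(None, host="localhost", dbname="pandas",
               user="postgres", password="postgres", port=5432)
print(dsn)  # e.g. "host=localhost dbname=pandas user=postgres password=postgres port=5432"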
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'raise' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...or >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...or >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
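# In the failure that follows, SQLAlchemy wraps the psycopg2 error in its own
# OperationalError (the sqlalche.me/e/20/e3q8 reference in this log). A minimal sketch of
# catching that wrapper at the Engine level; the URL mirrors the one used by these tests,
# and the except branch assumes, as here, that nothing is listening on port 5432.
from sqlalchemy import create_engine, exc, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect() as connection:
        connection.execute(text("SELECT 1"))
except exc.OperationalError as err:
    # err.orig holds the underlying psycopg2.OperationalError
    print(type(err.orig).__name__, err.orig)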
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-mysql_pymysql_engine_types] _ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2afad0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2afad0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-mysql_pymysql_conn_types] _ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... 
NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. 
a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
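The override-depth bookkeeping above (walking the request chain and indexing fixturedefs from the end) is what lets an overriding fixture request the fixture it shadows. A minimal sketch of that pattern (file and fixture names are illustrative):

    # conftest.py
    import pytest

    @pytest.fixture
    def value():
        return 1

    # test_override.py
    import pytest

    @pytest.fixture
    def value(value):        # shadows the conftest fixture ...
        return value + 1     # ... and receives it from "one level up"

    def test_value(value):
        assert value == 2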
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
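As the comment above notes, when a fixture is parametrized through a callspec the parametrize invocation's scope takes precedence over the fixture's own scope. A small illustrative example (fixture and values are made up):

    import pytest

    @pytest.fixture            # function-scoped by default
    def backend(request):
        return request.param

    # scope="module" here overrides the fixture's function scope, so the
    # fixture is set up once per parameter per module rather than per test
    @pytest.mark.parametrize("backend", ["sqlite", "postgres"],
                             indirect=True, scope="module")
    def test_backend(backend):
        assert backend in {"sqlite", "postgres"}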
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
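In the frames below, pandas' Debian-specific versioned_importorskip wraps import_optional_dependency so that a missing or too-old optional dependency becomes a skip rather than a failure. The closest upstream equivalent is pytest.importorskip; the minimum version shown here is hypothetical:

    import pytest

    # Skips the test unless pymysql imports and is at least this version.
    # It skips on ImportError only; an exception raised deeper in the
    # import chain, like the AttributeError in this log, still propagates
    # and the test errors instead of skipping.
    pymysql = pytest.importorskip("pymysql", minversion="1.0.0")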
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2afb90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2afb90> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
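The OperationalError above means nothing is listening on localhost:5432 inside the testbed, so every postgresql_psycopg2_* fixture fails the same way. A hypothetical reproduction using the connection parameters shown in the traceback:

    import psycopg2

    try:
        psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # "Connection refused" when no PostgreSQL server is running
        print("cannot reach PostgreSQL:", exc)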
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
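The psycopg2 connect() source quoted here builds the final DSN by merging the dsn string with any keyword arguments through psycopg2.extensions.make_dsn (the dsn = _ext.make_dsn(dsn, **kwargs) line that follows). A small sketch of that merging, with illustrative values only:

    # make_dsn merges a DSN string with keyword arguments into a single DSN string.
    from psycopg2.extensions import make_dsn

    dsn = make_dsn("host=localhost port=5432", dbname="pandas", user="postgres")
    print(dsn)  # e.g. 'host=localhost port=5432 dbname=pandas user=postgres'
                # (parameter order may differ between psycopg2 versions)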
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
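At the SQLAlchemy layer the driver error is not raised as-is: Connection.__init__ catches the DBAPI exception coming out of engine.raw_connection(), and _handle_dbapi_exception_noconnection re-raises it as sqlalchemy.exc.OperationalError, which is the wrapped error carrying the https://sqlalche.me/e/20/e3q8 link printed above. A minimal sketch, assuming the same connection URL as the failing fixture, of observing that wrapping and reaching the original psycopg2 exception:

    # SQLAlchemy wraps the DBAPI error; the original exception is kept on .orig.
    import sqlalchemy
    from sqlalchemy import create_engine

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except sqlalchemy.exc.OperationalError as exc:
        print(type(exc.orig))  # the underlying psycopg2.OperationalError
        print(exc.code)        # the error code used in the sqlalche.me link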
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s read_sql = , text = 'types' 2411s mode = 'sqlalchemy', error = 'coerce' 2411s types_data_frame = TextCol DateCol ... IntColWithNull BoolColWithNull 2411s 0 first 2000-01-03 00:00:00 ... 1.0 0.0 2411s 1 first 2000-01-04 00:00:00 ... NaN NaN 2411s 2411s [2 rows x 9 columns] 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s @pytest.mark.parametrize("error", ["ignore", "raise", "coerce"]) 2411s @pytest.mark.parametrize( 2411s "read_sql, text, mode", 2411s [ 2411s (sql.read_sql, "SELECT * FROM types", ("sqlalchemy", "fallback")), 2411s (sql.read_sql, "types", ("sqlalchemy")), 2411s ( 2411s sql.read_sql_query, 2411s "SELECT * FROM types", 2411s ("sqlalchemy", "fallback"), 2411s ), 2411s (sql.read_sql_table, "types", ("sqlalchemy")), 2411s ], 2411s ) 2411s def test_api_custom_dateparsing_error( 2411s conn, request, read_sql, text, mode, error, types_data_frame 2411s ): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
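On the pytest side, test_api_custom_dateparsing_error receives the connection fixture's name as a parametrized string and resolves it at runtime with request.getfixturevalue(conn); that is why the refused connection surfaces during fixture setup inside the test body, once per parametrization, rather than at collection time. A minimal, self-contained sketch of that pattern (the fixture and test names below are illustrative, not pandas'):

    import pytest

    @pytest.fixture
    def sqlite_str():
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_dynamic_fixture(conn, request):
        # The named fixture runs here; if it raises (like the postgres-backed
        # fixtures in the log), the error is reported against this test.
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")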
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...r >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...r >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _____________ test_api_date_and_index[mysql_pymysql_engine_types] ______________ 2411s conn = 'mysql_pymysql_engine_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_and_index(conn, request): 2411s # Test case where same column appears in parse_date and index_col 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1880: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af2f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af2f0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ______________ test_api_date_and_index[mysql_pymysql_conn_types] _______________ 2411s conn = 'mysql_pymysql_conn_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_and_index(conn, request): 2411s # Test case where same column appears in parse_date and index_col 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1880: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
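The AttributeError that ends the traceback above means the MySQL-backed fixtures never reach a connection attempt at all: importing pymysql pulls in cryptography, and on this testbed cryptography's Python layer cannot find the expected 'hashes' attribute on its compiled _rust.openssl bindings, which usually points to a mismatch between the installed cryptography Python files and the Rust extension. Because the failure surfaces as AttributeError rather than ImportError, a plain ImportError-based skip would not catch it. Below is a hypothetical sketch (not pandas' import_optional_dependency or versioned_importorskip) of a defensive optional import that treats any import-time failure as the dependency being unavailable:

import importlib
from types import ModuleType
from typing import Optional

def try_import(name: str) -> Optional[ModuleType]:
    # Broad except on purpose: the pymysql import above dies with an
    # AttributeError deep inside cryptography, not with an ImportError
    # on pymysql itself.
    try:
        return importlib.import_module(name)
    except Exception as exc:
        print(f"optional dependency {name!r} unusable: {type(exc).__name__}: {exc}")
        return None

if __name__ == "__main__":
    print("pymysql importable:", try_import("pymysql") is not None)

Under a scheme like this the mysql_pymysql_* parametrizations would likely be reported as skipped rather than erroring during fixture setup, as they do in the remainder of this log.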
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 
2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
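[editor's note] versioned_importorskip and import_optional_dependency, whose docstrings are quoted above, implement a skip-or-raise policy for optional dependencies that are missing or too old. A hedged sketch of that pattern; the helper name and the use of importlib.metadata and packaging are illustrative, not pandas' actual implementation (which also maps module names to install names):

import importlib
from importlib.metadata import version
from packaging.version import Version   # assumption: packaging is available

def optional_import(name, min_version=None, errors="raise"):
    # Sketch of the policy only: 'raise' raises, other modes return None.
    try:
        module = importlib.import_module(name)
    except ImportError:
        if errors == "raise":
            raise ImportError(f"Missing optional dependency '{name}'.")
        return None
    if min_version is not None and Version(version(name)) < Version(min_version):
        if errors == "raise":
            raise ImportError(f"'{name}' must be at least version {min_version}.")
        return None
    return module

print(optional_import("pymysql", min_version="1.0.0", errors="ignore"))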
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af410>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2af410> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
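[editor's note] The pymysql package header quoted above exposes the DB-API 2.0 module attributes and a mysqlclient compatibility shim. A short illustration of that shim, assuming pymysql imports cleanly (on this testbed it does not, because of the cryptography failure shown just below):

import pymysql

print(pymysql.apilevel, pymysql.paramstyle, pymysql.threadsafety)   # 2.0 pyformat 1

pymysql.install_as_MySQLdb()
import MySQLdb                          # now an alias for the pymysql module

print(MySQLdb is pymysql)               # True
print(MySQLdb.get_client_info())        # mysqlclient-style version string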
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s __________ test_api_date_and_index[postgresql_psycopg2_engine_types] ___________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
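[editor's note] The AttributeError above is the root cause of every mysql_pymysql_* fixture failure in this run: cryptography's hashes module expects hash support in its Rust bindings (rust_openssl.hashes.Hash), and the testbed's build does not provide it. For reference, this is the high-level API that module normally exposes; the snippet assumes a healthy cryptography installation:

from cryptography.hazmat.primitives import hashes

digest = hashes.Hash(hashes.SHA256())   # backed by rust_openssl.hashes.Hash
digest.update(b"pandas ")
digest.update(b"autopkgtest")
print(digest.finalize().hex())          # 64 hex characters (32-byte digest)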
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_engine_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_and_index(conn, request): 2411s # Test case where same column appears in parse_date and index_col 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1880: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
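[editor's note] test_api_date_and_index, quoted above, receives the connection fixture's name as a parametrized string and resolves it at runtime with request.getfixturevalue(), which is why the connection error surfaces during fixture setup rather than collection. A minimal standalone sketch of that pattern, with hypothetical fixture names:

import pytest

@pytest.fixture
def sqlite_str():
    return "sqlite:///:memory:"

@pytest.fixture
def sqlite_engine(sqlite_str):
    sqlalchemy = pytest.importorskip("sqlalchemy")
    return sqlalchemy.create_engine(sqlite_str)

@pytest.mark.parametrize("conn", ["sqlite_str", "sqlite_engine"])
def test_connectable(conn, request):
    conn = request.getfixturevalue(conn)   # resolve the fixture by name
    assert conn is not None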
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...ubRequest 'postgresql_psycopg2_engine_types' for >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...ubRequest 'postgresql_psycopg2_engine_types' for >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
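[editor's note] The pluggy frames quoted above show how pytest_fixture_setup is dispatched: the hook is declared firstresult=True, so the call returns the first non-None implementation result instead of a list. A self-contained pluggy sketch with hypothetical project and hook names:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)
    def compute(self, value):
        """Return a result for value."""

class PluginA:
    @hookimpl
    def compute(self, value):
        return value * 2

class PluginB:
    @hookimpl
    def compute(self, value):
        return None              # None is skipped for firstresult hooks

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(PluginA())
pm.register(PluginB())
print(pm.hook.compute(value=21))   # 42: first non-None result wins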
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
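[editor's note] Both postgresql_psycopg2_* failures reduce to the psycopg2.connect() call documented above being refused on localhost:5432, i.e. no PostgreSQL server is running on the testbed. For reference, the keyword form of the same call, mirroring the DSN from the log; it raises the same OperationalError unless a server is actually listening:

import psycopg2

conn = psycopg2.connect(
    host="localhost",
    port=5432,
    dbname="pandas",
    user="postgres",
    password="postgres",
)
with conn, conn.cursor() as cur:   # the with-block manages the transaction, not the connection
    cur.execute("SELECT version()")
    print(cur.fetchone()[0])
conn.close()                       # must be closed explicitly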
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ___________ test_api_date_and_index[postgresql_psycopg2_conn_types] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
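[editor's note] For contrast with the failing Engine above, the same SQLAlchemy entry points quoted in the traceback (Engine.connect(), Engine.raw_connection(), the pool checkout) can be exercised against an in-memory SQLite database that needs no external server; a brief sketch:

from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///:memory:")

# Engine.connect(): Connection used as a context manager, per the docstring above.
with engine.connect() as connection:
    connection.execute(text("CREATE TABLE t (x INTEGER)"))
    connection.execute(text("INSERT INTO t VALUES (1)"))
    connection.commit()
    print(connection.execute(text("SELECT x FROM t")).scalar())   # 1

# Engine.raw_connection(): the pooled DBAPI connection; close() returns it to the pool.
raw = engine.raw_connection()
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())   # (1,)
finally:
    raw.close()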
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn_types' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable_types) 2411s def test_api_date_and_index(conn, request): 2411s # Test case where same column appears in parse_date and index_col 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1880: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_engine_types' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'requ...>} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2411s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2411s dialect = 'postgres' 2411s 2411s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2411s from sqlalchemy import insert 2411s from sqlalchemy.engine import Engine 2411s 2411s types = types_table_metadata(dialect) 2411s 2411s stmt = insert(types).values(types_data) 2411s if isinstance(conn, Engine): 2411s 
> with conn.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters 
if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 
2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise 
exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ___________________ test_api_timedelta[mysql_pymysql_engine] ___________________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_timedelta(conn, request): 2411s # see #6921 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2afe90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2afe90> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ____________________ test_api_timedelta[mysql_pymysql_conn] ____________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_timedelta(conn, request): 2411s # see #6921 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
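The root failure above is a mismatch between the pure-Python cryptography package and its compiled _rust extension: hashes.py expects rust_openssl.hashes, the installed bindings do not provide it, and the resulting AttributeError (rather than an ImportError) propagates out of pymysql's import, which is presumably why versioned_importorskip errors instead of skipping. A minimal standalone check, assuming only that the same cryptography installation is importable:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(cryptography.__version__)
    # Prints False on an installation showing the AttributeError above.
    print(hasattr(rust_openssl, "hashes"))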
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2aff50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb2aff50> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ________________ test_api_timedelta[postgresql_psycopg2_engine] ________________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_timedelta(conn, request): 2411s # see #6921 2411s conn_name = conn 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_timedelta", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1897: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_timedelta' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
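The psycopg2 failures are environmental rather than an import problem: the fixture builds a SQLAlchemy engine for a local PostgreSQL server, and the very first connection attempt is refused because nothing is listening on port 5432 in this environment. A reduced sketch of what the fixture effectively does, using the connection parameters visible in the traceback (a running server on localhost:5432 is assumed, which this testbed does not provide):

    from sqlalchemy import create_engine, text

    # Same URL as the Engine repr shown above.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

    with engine.connect() as conn:  # raises OperationalError: Connection refused here
        conn.execute(text("SELECT 1"))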
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
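For reference, the DBAPI-level call that SQLAlchemy ultimately issues is the plain psycopg2.connect() shown in the frame above; calling it directly reproduces the same OperationalError without any of the pool machinery (parameters mirror the dsn in the log):

    import psycopg2

    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )  # psycopg2.OperationalError: Connection refused on this testbed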
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _________________ test_api_timedelta[postgresql_psycopg2_conn] _________________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_timedelta(conn, request): 2411s # see #6921 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
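Note: the psycopg2.OperationalError shown just above is the root failure for test_api_timedelta[postgresql_psycopg2_conn]; the rest of this traceback only shows pytest resolving the postgresql_psycopg2_conn fixture via request.getfixturevalue and SQLAlchemy re-raising the driver error as sqlalchemy.exc.OperationalError ("the above exception was the direct cause of the following exception"). A minimal standalone sketch of what the fixture attempts, assuming the engine URL visible in the log (the Engine repr masks the password as ***, but the dsn line shows password=postgres):

    from sqlalchemy import create_engine

    # Hypothetical reproduction, not part of the pandas test suite.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    with engine.connect() as conn:   # raises sqlalchemy.exc.OperationalError on this testbed
        pass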
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ________________ test_api_complex_raises[mysql_pymysql_engine] _________________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_complex_raises(conn, request): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1942: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
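Note: both postgresql_psycopg2 parametrizations of test_api_timedelta fail the same way: nothing is listening on localhost:5432 in this testbed, so the connection is refused on both ::1 and 127.0.0.1 before any pandas code runs. A hedged probe using the same parameters as the dsn shown above (host=localhost, port=5432, user=postgres, password=postgres, dbname=pandas) reproduces the condition:

    import psycopg2

    # Hypothetical probe; parameters copied from the dsn in the log above.
    try:
        conn = psycopg2.connect(
            host="localhost", port=5432,
            user="postgres", password="postgres", dbname="pandas",
        )
        conn.close()
        print("PostgreSQL reachable")
    except psycopg2.OperationalError as exc:
        # On this testbed: connection to server at "localhost" ... Connection refused
        print("PostgreSQL not reachable:", exc)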
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb458950>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb458950> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _________________ test_api_complex_raises[mysql_pymysql_conn] __________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_complex_raises(conn, request): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1942: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
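The traceback above ends in AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes', raised while hashes.py binds Hash = rust_openssl.hashes.Hash. That suggests the pure-Python side of python3-cryptography and its compiled _rust extension on this testbed come from mismatched builds. A hypothetical diagnostic snippet, not part of this log, to confirm what the installed binding actually exposes:

import cryptography
from cryptography.hazmat.bindings import _rust

print("cryptography version:", cryptography.__version__)
print("binding has 'openssl':", hasattr(_rust, "openssl"))
print("openssl exposes 'hashes':", hasattr(getattr(_rust, "openssl", None), "hashes"))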
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
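The pluggy hook call above is keyword-only, and because this hook runs with firstresult = True (as shown in the dumped arguments) the first non-None implementation result is returned instead of a list. A small self-contained sketch of that behaviour under a hypothetical "demo" project name:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)
    def pick(self, value): ...

class Plugin:
    @hookimpl
    def pick(self, value):
        return value * 2

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(Plugin())
print(pm.hook.pick(value=21))  # keyword-only call; firstresult gives 42, not [42]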
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
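As the docstring above spells out, import_optional_dependency handles a dependency that is missing or too old (raise/warn/ignore); internally it simply calls importlib.import_module, and, as the traceback shows, the AttributeError raised while pymysql's module body executes is not intercepted here and propagates instead of turning into a skip. An illustrative-only call into the private helper (pandas.compat._optional is not public API):

from pandas.compat._optional import import_optional_dependency

# errors="ignore" turns "not installed" into None, but an import-time error such
# as the AttributeError seen above would still propagate to the caller.
mod = import_optional_dependency("pymysql", errors="ignore")
print("pymysql importable:", mod is not None)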
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb458a70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb458a70> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _____________ test_api_complex_raises[postgresql_psycopg2_engine] ______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
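The raw_connection docstring above describes a pool-proxied DBAPI connection whose close() hands it back to the pool rather than closing it. A hedged usage sketch against an in-memory SQLite engine, chosen so it runs without any database server:

from sqlalchemy import create_engine

engine = create_engine("sqlite://")
raw = engine.raw_connection()      # proxied DBAPI connection checked out of the pool
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchall())          # [(1,)]
finally:
    raw.close()                    # returns the connection to the pool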
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_complex_raises(conn, request): 2411s conn_name = conn 2411s conn = request.getfixturevalue(conn) 2411s df = DataFrame({"a": [1 + 1j, 2j]}) 2411s 2411s if "adbc" in conn_name: 2411s msg = "datatypes not supported" 2411s else: 2411s msg = "Complex datatypes not supported" 2411s with pytest.raises(ValueError, match=msg): 2411s > assert df.to_sql("test_complex", con=conn) is None 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1950: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ( a 2411s 0 1.0+1.0j 2411s 1 0.0+2.0j, 'test_complex') 2411s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s @wraps(func) 2411s def wrapper(*args, **kwargs): 2411s if len(args) > num_allow_args: 2411s warnings.warn( 2411s msg.format(arguments=_format_argument_list(allow_args)), 2411s FutureWarning, 2411s stacklevel=find_stack_level(), 2411s ) 2411s > return func(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = a 2411s 0 1.0+1.0j 2411s 1 0.0+2.0j, name = 'test_complex' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, if_exists = 'fail', index = True, index_label = None 2411s chunksize = None, dtype = None, method = None 2411s 2411s @final 2411s @deprecate_nonkeyword_arguments( 2411s version="3.0", 
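The test body shown above only needs its fixture to hand it a working connection; the assertion itself does not depend on the backend (the expected message differs only for ADBC drivers). A reduced sketch of the same check against a plain sqlite3 connection, assumed here instead of the MySQL/PostgreSQL fixtures whose setup is what actually fails in this run:

import sqlite3

import pandas as pd
import pytest

df = pd.DataFrame({"a": [1 + 1j, 2j]})
con = sqlite3.connect(":memory:")
with pytest.raises(ValueError, match="Complex datatypes not supported"):
    df.to_sql("test_complex", con=con)
con.close()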
allowed_args=["self", "name", "con"], name="to_sql" 2411s ) 2411s def to_sql( 2411s self, 2411s name: str, 2411s con, 2411s schema: str | None = None, 2411s if_exists: Literal["fail", "replace", "append"] = "fail", 2411s index: bool_t = True, 2411s index_label: IndexLabel | None = None, 2411s chunksize: int | None = None, 2411s dtype: DtypeArg | None = None, 2411s method: Literal["multi"] | Callable | None = None, 2411s ) -> int | None: 2411s """ 2411s Write records stored in a DataFrame to a SQL database. 2411s 2411s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2411s newly created, appended to, or overwritten. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s Name of SQL table. 2411s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. Legacy support is provided for sqlite3.Connection objects. The user 2411s is responsible for engine disposal and connection closure for the SQLAlchemy 2411s connectable. See `here \ 2411s `_. 2411s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2411s the transaction will not be committed. If passing a sqlite3.Connection, 2411s it will not be possible to roll back the record insertion. 2411s 2411s schema : str, optional 2411s Specify the schema (if database flavor supports this). If None, use 2411s default schema. 2411s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2411s How to behave if the table already exists. 2411s 2411s * fail: Raise a ValueError. 2411s * replace: Drop the table before inserting new values. 2411s * append: Insert new values to the existing table. 2411s 2411s index : bool, default True 2411s Write DataFrame index as a column. Uses `index_label` as the column 2411s name in the table. Creates a table index for this column. 2411s index_label : str or sequence, default None 2411s Column label for index column(s). If None is given (default) and 2411s `index` is True, then the index names are used. 2411s A sequence should be given if the DataFrame uses MultiIndex. 2411s chunksize : int, optional 2411s Specify the number of rows in each batch to be written at a time. 2411s By default, all rows will be written at once. 2411s dtype : dict or scalar, optional 2411s Specifying the datatype for columns. If a dictionary is used, the 2411s keys should be the column names and the values should be the 2411s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2411s scalar is provided, it will be applied to all columns. 2411s method : {None, 'multi', callable}, optional 2411s Controls the SQL insertion clause used: 2411s 2411s * None : Uses standard SQL ``INSERT`` clause (one per row). 2411s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2411s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2411s 2411s Details and a sample callable implementation can be found in the 2411s section :ref:`insert method `. 2411s 2411s Returns 2411s ------- 2411s None or int 2411s Number of rows affected by to_sql. None is returned if the callable 2411s passed into ``method`` does not return an integer number of rows. 2411s 2411s The number of returned rows affected is the sum of the ``rowcount`` 2411s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2411s reflect the exact number of written rows as stipulated in the 2411s `sqlite3 `__ or 2411s `SQLAlchemy `__. 2411s 2411s .. 
versionadded:: 1.4.0 2411s 2411s Raises 2411s ------ 2411s ValueError 2411s When the table already exists and `if_exists` is 'fail' (the 2411s default). 2411s 2411s See Also 2411s -------- 2411s read_sql : Read a DataFrame from a table. 2411s 2411s Notes 2411s ----- 2411s Timezone aware datetime columns will be written as 2411s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2411s database. Otherwise, the datetimes will be stored as timezone unaware 2411s timestamps local to the original timezone. 2411s 2411s Not all datastores support ``method="multi"``. Oracle, for example, 2411s does not support multi-value insert. 2411s 2411s References 2411s ---------- 2411s .. [1] https://docs.sqlalchemy.org 2411s .. [2] https://www.python.org/dev/peps/pep-0249/ 2411s 2411s Examples 2411s -------- 2411s Create an in-memory SQLite database. 2411s 2411s >>> from sqlalchemy import create_engine 2411s >>> engine = create_engine('sqlite://', echo=False) 2411s 2411s Create a table from scratch with 3 rows. 2411s 2411s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2411s >>> df 2411s name 2411s 0 User 1 2411s 1 User 2 2411s 2 User 3 2411s 2411s >>> df.to_sql(name='users', con=engine) 2411s 3 2411s >>> from sqlalchemy import text 2411s >>> with engine.connect() as conn: 2411s ... conn.execute(text("SELECT * FROM users")).fetchall() 2411s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2411s 2411s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2411s 2411s >>> with engine.begin() as connection: 2411s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2411s ... df1.to_sql(name='users', con=connection, if_exists='append') 2411s 2 2411s 2411s This is allowed to support operations that require that the same 2411s DBAPI connection is used for the entire operation. 2411s 2411s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2411s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2411s 2 2411s >>> with engine.connect() as conn: 2411s ... conn.execute(text("SELECT * FROM users")).fetchall() 2411s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2411s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2411s (1, 'User 7')] 2411s 2411s Overwrite the table with just ``df2``. 2411s 2411s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2411s ... index_label='id') 2411s 2 2411s >>> with engine.connect() as conn: 2411s ... conn.execute(text("SELECT * FROM users")).fetchall() 2411s [(0, 'User 6'), (1, 'User 7')] 2411s 2411s Use ``method`` to define a callable insertion method to do nothing 2411s if there's a primary key conflict on a table in a PostgreSQL database. 2411s 2411s >>> from sqlalchemy.dialects.postgresql import insert 2411s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2411s ... # "a" is the primary key in "conflict_table" 2411s ... data = [dict(zip(keys, row)) for row in data_iter] 2411s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2411s ... result = conn.execute(stmt) 2411s ... return result.rowcount 2411s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2411s 0 2411s 2411s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2411s on a primary key. 2411s 2411s >>> from sqlalchemy.dialects.mysql import insert 2411s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2411s ... # update columns "b" and "c" on primary key conflict 2411s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2411s ... stmt = ( 2411s ... insert(table.table) 2411s ... .values(data) 2411s ... ) 2411s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2411s ... result = conn.execute(stmt) 2411s ... return result.rowcount 2411s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2411s 2 2411s 2411s Specify the dtype (especially useful for integers with missing values). 2411s Notice that while pandas is forced to store the data as floating point, 2411s the database supports nullable integers. When fetching the data with 2411s Python, we get back integer scalars. 2411s 2411s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2411s >>> df 2411s A 2411s 0 1.0 2411s 1 NaN 2411s 2 2.0 2411s 2411s >>> from sqlalchemy.types import Integer 2411s >>> df.to_sql(name='integers', con=engine, index=False, 2411s ... dtype={"A": Integer()}) 2411s 3 2411s 2411s >>> with engine.connect() as conn: 2411s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2411s [(1,), (None,), (2,)] 2411s """ # noqa: E501 2411s from pandas.io import sql 2411s 2411s > return sql.to_sql( 2411s self, 2411s name, 2411s con, 2411s schema=schema, 2411s if_exists=if_exists, 2411s index=index, 2411s index_label=index_label, 2411s chunksize=chunksize, 2411s dtype=dtype, 2411s method=method, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s frame = a 2411s 0 1.0+1.0j 2411s 1 0.0+2.0j, name = 'test_complex' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, if_exists = 'fail', index = True, index_label = None 2411s chunksize = None, dtype = None, method = None, engine = 'auto' 2411s engine_kwargs = {} 2411s 2411s def to_sql( 2411s frame, 2411s name: str, 2411s con, 2411s schema: str | None = None, 2411s if_exists: Literal["fail", "replace", "append"] = "fail", 2411s index: bool = True, 2411s index_label: IndexLabel | None = None, 2411s chunksize: int | None = None, 2411s dtype: DtypeArg | None = None, 2411s method: Literal["multi"] | Callable | None = None, 2411s engine: str = "auto", 2411s **engine_kwargs, 2411s ) -> int | None: 2411s """ 2411s Write records stored in a DataFrame to a SQL database. 2411s 2411s Parameters 2411s ---------- 2411s frame : DataFrame, Series 2411s name : str 2411s Name of SQL table. 2411s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s or sqlite3 DBAPI2 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : str, optional 2411s Name of SQL schema in database to write to (if database flavor 2411s supports this). If None, use default schema (default). 2411s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2411s - fail: If table exists, do nothing. 2411s - replace: If table exists, drop it, recreate it, and insert data. 2411s - append: If table exists, insert data. Create if does not exist. 2411s index : bool, default True 2411s Write DataFrame index as a column. 2411s index_label : str or sequence, optional 2411s Column label for index column(s). If None is given (default) and 2411s `index` is True, then the index names are used. 
2411s A sequence should be given if the DataFrame uses MultiIndex. 2411s chunksize : int, optional 2411s Specify the number of rows in each batch to be written at a time. 2411s By default, all rows will be written at once. 2411s dtype : dict or scalar, optional 2411s Specifying the datatype for columns. If a dictionary is used, the 2411s keys should be the column names and the values should be the 2411s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2411s scalar is provided, it will be applied to all columns. 2411s method : {None, 'multi', callable}, optional 2411s Controls the SQL insertion clause used: 2411s 2411s - None : Uses standard SQL ``INSERT`` clause (one per row). 2411s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2411s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2411s 2411s Details and a sample callable implementation can be found in the 2411s section :ref:`insert method `. 2411s engine : {'auto', 'sqlalchemy'}, default 'auto' 2411s SQL engine library to use. If 'auto', then the option 2411s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2411s behavior is 'sqlalchemy' 2411s 2411s .. versionadded:: 1.3.0 2411s 2411s **engine_kwargs 2411s Any additional kwargs are passed to the engine. 2411s 2411s Returns 2411s ------- 2411s None or int 2411s Number of rows affected by to_sql. None is returned if the callable 2411s passed into ``method`` does not return an integer number of rows. 2411s 2411s .. versionadded:: 1.4.0 2411s 2411s Notes 2411s ----- 2411s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2411s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2411s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2411s rows as stipulated in the 2411s `sqlite3 `__ or 2411s `SQLAlchemy `__ 2411s """ # noqa: E501 2411s if if_exists not in ("fail", "replace", "append"): 2411s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2411s 2411s if isinstance(frame, Series): 2411s frame = frame.to_frame() 2411s elif not isinstance(frame, DataFrame): 2411s raise NotImplementedError( 2411s "'frame' argument should be either a Series or a DataFrame" 2411s ) 2411s 2411s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = True 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 
2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = True 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
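The Engine.connect docstring quoted earlier in this traceback shows the context-manager pattern pandas relies on. A minimal sketch, assuming SQLAlchemy 2.x and an in-memory SQLite URL so that, unlike the postgresql+psycopg2 engine here, nothing has to be listening on a port:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite+pysqlite:///:memory:")

    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t (x) VALUES (:x)"), [{"x": 1}, {"x": 2}])
        connection.commit()  # commit before the block returns the connection to the pool
        print(connection.execute(text("SELECT count(*) FROM t")).scalar_one())  # 2

    engine.dispose()  # release pooled connections, much as the exit_stack above does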
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______________ test_api_complex_raises[postgresql_psycopg2_conn] _______________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_complex_raises(conn, request): 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1942: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______ test_api_to_sql_index_label[None-None-index-mysql_pymysql_engine] _______ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s index_name = None, index_label = None, expected = 'index' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459490>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459490> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______ test_api_to_sql_index_label[None-None-index-mysql_pymysql_conn] ________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s index_name = None, index_label = None, expected = 'index' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 
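The fixture error above bottoms out in cryptography's hashes module: pymysql._auth imports cryptography.hazmat.primitives.serialization and hashes, and hashes.py raises AttributeError because the compiled _rust bindings it finds expose no hashes attribute. Since that is an AttributeError rather than an ImportError, it is not turned into a test skip, which is why the full traceback is reported here. A minimal diagnostic sketch, not part of the test suite (the module list and output format are assumptions), that walks the same import chain in isolation:

    import importlib

    # Walk the import chain from the traceback above, innermost first, and report
    # which module is the first that fails to import on this interpreter.
    for mod in (
        "cryptography.hazmat.bindings._rust",
        "cryptography.hazmat.primitives.hashes",
        "cryptography.hazmat.primitives.serialization",
        "pymysql._auth",
        "pymysql",
    ):
        try:
            importlib.import_module(mod)
            print(f"OK   {mod}")
        except Exception as exc:  # the log above shows an AttributeError raised at import time
            print(f"FAIL {mod}: {type(exc).__name__}: {exc}")

On a consistent installation every module prints OK; in the environment captured here the hashes import is the first to fail, so every fixture that reaches pymysql errors instead of skipping.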
2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 
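request.getfixturevalue, whose docstring is quoted above, is how these parametrized SQL tests turn a fixture name string into a live fixture at run time. A self-contained sketch of that pattern under pytest; the fixture and test names below are illustrative, not pandas' real ones:

    import sqlite3

    import pytest

    @pytest.fixture
    def sqlite_conn_sketch():
        # Stand-in for the connection fixtures used by test_sql.py.
        con = sqlite3.connect(":memory:")
        yield con
        con.close()

    @pytest.mark.parametrize("conn", ["sqlite_conn_sketch"])
    def test_dynamic_fixture_lookup(conn, request):
        # The parameter is only a string; the fixture is resolved here, at run time.
        con = request.getfixturevalue(conn)
        assert con.execute("select 1").fetchone() == (1,)

If the name cannot be resolved, the lookup raises pytest.FixtureLookupError, the error path visible in _get_active_fixturedef above.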
2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 
2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 
2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
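The cached_result check in FixtureDef.execute above wraps its == comparison in try/except because a parametrized value may not collapse to a plain bool. A tiny illustration of the case the comment cites (numpy arrays, pytest issue #6497); the variable names are invented:

    import numpy as np

    old_key = np.array([1, 2, 3])
    new_key = np.array([1, 2, 3])
    try:
        # Element-wise comparison returns an array, and bool() on it raises ValueError.
        cache_hit = bool(new_key == old_key)
    except (ValueError, RuntimeError):
        # Same fallback as in execute(): compare identity instead.
        cache_hit = new_key is old_key
    print(cache_hit)  # False: different objects, so treated as a cache miss and the fixture is re-run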
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459550>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459550> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___ test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_engine] ____ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
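psycopg2.connect, quoted above, accepts either a single DSN string or keyword arguments; the DSN used in this run is host=localhost dbname=pandas user=postgres password=postgres port=5432. A small sketch, with an assumed helper name and timeout, that probes whether anything is listening on that port before attempting the connection that is refused below:

    import socket

    import psycopg2

    def postgres_listening(host: str = "localhost", port: int = 5432) -> bool:
        # Cheap TCP probe; says nothing about credentials, only reachability.
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            return False

    if postgres_listening():
        # Keyword form; the single DSN-string form shown in the docstring is equivalent.
        conn = psycopg2.connect(
            dbname="pandas", user="postgres", password="postgres",
            host="localhost", port=5432,
        )
        conn.close()
    else:
        print("nothing listening on localhost:5432, expect OperationalError: Connection refused")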
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s index_name = None, index_label = None, expected = 'index' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_index_label", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1977: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_index_label' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 
2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 
2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = 
_raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ____ test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_conn] _____ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s index_name = None, index_label = None, expected = 'index' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_to_sql_index_label[None-other_label-other_label-mysql_pymysql_engine] _ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s index_name = None, index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459d30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459d30> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_to_sql_index_label[None-other_label-other_label-mysql_pymysql_conn] _ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s index_name = None, index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 
2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 
2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 
2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 
2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459e50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb459e50> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_engine] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s index_name = None, index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_index_label", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1977: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_index_label' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 
2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 
2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = 
_raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_conn] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s index_name = None, index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_to_sql_index_label[index_name-None-index_name-mysql_pymysql_engine] _ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s index_name = 'index_name', index_label = None, expected = 'index_name' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
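The pluggy frames above are generic hook-dispatch machinery rather than anything pandas-specific. A minimal, self-contained pluggy example, independent of pytest, illustrating the same call path; all names here are illustrative:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class DemoSpec:
        @hookspec(firstresult=True)
        def setup_resource(self, name):
            """Return a resource for *name*; the first non-None result wins."""

    class DemoPlugin:
        @hookimpl
        def setup_resource(self, name):
            return f"resource:{name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(DemoSpec)
    pm.register(DemoPlugin())
    # keyword-only call, dispatched through PluginManager._hookexec as in the frames above
    print(pm.hook.setup_resource(name="db"))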
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45a810>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45a810> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s __ test_api_to_sql_index_label[index_name-None-index_name-mysql_pymysql_conn] __ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s index_name = 'index_name', index_label = None, expected = 'index_name' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 
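The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') aborts the pymysql import chain, so every MySQL-backed fixture fails while importing pymysql. For reference, this is the documented hashes API the module normally exposes once the Rust bindings load correctly; a minimal usage sketch, not a fix for the binding problem:

    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())   # backed by rust_openssl.hashes.Hash
    digest.update(b"pandas")
    print(digest.finalize().hex())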
2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 
2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 
2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 
2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45a930>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45a930> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_engine] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
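Engine.raw_connection(), whose docstring appears in these frames, checks a DBAPI connection out of the pool, so it can only succeed if a PostgreSQL server is actually listening on localhost:5432; none is running in this testbed, hence the Connection refused errors. A minimal sketch of the same call path, using the DSN parameters shown in the traceback:

    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        raw = engine.raw_connection()   # proxied psycopg2 connection from the pool
        raw.close()                     # returns the connection to the pool
    except sqlalchemy.exc.OperationalError as exc:
        print("connection failed:", exc.orig)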
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s index_name = 'index_name', index_label = None, expected = 'index_name' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_index_label", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1977: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_index_label' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 
2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 
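Note: the has_table() and pandasSQL_builder() docstrings quoted in this traceback say that when a plain DBAPI2 object is passed, only sqlite3 is supported; everything else goes through SQLAlchemy. A minimal sketch of that sqlite3 path, which needs no database server (the table name test_index_label is borrowed from the test above purely for illustration):

import sqlite3
from pandas.io import sql

# In-memory SQLite database; no server required.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE test_index_label (idx INTEGER, value TEXT)")

# pandasSQL_builder() wraps a sqlite3 connection in SQLiteDatabase, so this
# check never touches SQLAlchemy or a network socket.
print(sql.has_table("test_index_label", con))  # True
print(sql.has_table("missing_table", con))     # False
con.close()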
2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = 
_raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
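Note: Engine.raw_connection(), whose docstring appears repeatedly in these frames, hands back a proxied DBAPI connection whose close() returns it to the pool instead of really closing it. A minimal sketch of that behaviour, assuming an in-memory SQLite engine so it runs without the PostgreSQL server these tests expect:

from sqlalchemy import create_engine

engine = create_engine("sqlite://")   # in-memory database, no server needed

raw = engine.raw_connection()         # pooled, proxied DBAPI connection
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())             # (1,)
    cur.close()
finally:
    raw.close()                       # returns the connection to the pool

engine.dispose()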
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
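Note: the psycopg2.connect() docstring above lists the basic connection parameters; the failing fixture passes them as keyword arguments, as shown in cparams. A minimal sketch of both spellings, with the OperationalError raised throughout this log handled explicitly; the credentials are the ones this test run uses and assume a server is actually listening on localhost:5432:

import psycopg2

try:
    # Keyword form, matching the cparams in the traceback.
    conn = psycopg2.connect(
        dbname="pandas", user="postgres", password="postgres",
        host="localhost", port=5432,
    )
except psycopg2.OperationalError as exc:
    # The failure seen here: nothing is accepting TCP/IP connections on port 5432.
    print(f"connection failed: {exc}")
else:
    conn.close()

# Equivalent DSN-string form from the docstring:
# conn = psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")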
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_conn] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s index_name = 'index_name', index_label = None, expected = 'index_name' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
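Note: request.getfixturevalue(), documented in the frame above, is what lets test_api_to_sql_index_label resolve a parametrized fixture name into the fixture value at setup time; the connection fixture only runs once a test actually asks for it. A small self-contained sketch of the same pattern, using hypothetical fixture names (fast_backend, slow_backend) that are not part of the pandas suite:

import pytest

@pytest.fixture
def fast_backend():
    return "fast"

@pytest.fixture
def slow_backend():
    return "slow"

# Parametrize over fixture *names*, then resolve them dynamically, the same way
# the pandas SQL tests handle their many connection fixtures.
@pytest.mark.parametrize("backend", ["fast_backend", "slow_backend"])
def test_backend(backend, request):
    value = request.getfixturevalue(backend)   # runs the named fixture on demand
    assert value in ("fast", "slow")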
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
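Note: psycopg2.connect() above folds the keyword parameters into the DSN shown in the frame (host=localhost dbname=pandas user=postgres password=postgres port=5432) and hands it to libpq, so "Connection refused" simply means nothing is listening on that port. A minimal sketch of a pre-flight check for that DSN; the helper name postgres_listening is made up for illustration:

import socket

def postgres_listening(host="localhost", port=5432, timeout=1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if postgres_listening():
    import psycopg2
    # Same parameters the failing tests use.
    conn = psycopg2.connect(dbname="pandas", user="postgres",
                            password="postgres", host="localhost", port=5432)
    conn.close()
else:
    print("no PostgreSQL server on localhost:5432 - the SQL tests fail as above")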
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_to_sql_index_label[index_name-other_label-other_label-mysql_pymysql_engine] _ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s index_name = 'index_name', index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
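Note: pytest_fixture_setup is dispatched through pluggy: the hook caller accepts keyword arguments only, and because the hook is declared firstresult it returns the first non-None implementation result. A minimal sketch of that calling convention, with a made-up "demo" project name and hook:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)
    def setup_value(self, name):
        """Return a value for *name*; the first non-None result wins."""

class Plugin:
    @hookimpl
    def setup_value(self, name):
        return f"value-for-{name}"

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(Plugin())
# Hooks are called with keyword arguments matching the spec, as in the frame above.
print(pm.hook.setup_value(name="mysql_pymysql_engine"))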
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
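Note: versioned_importorskip() defers to pandas' private import_optional_dependency(), whose docstring is quoted above. A minimal sketch of the errors="ignore" path that docstring describes (internal pandas API, may change); note that in this run the pymysql import fails with an AttributeError rather than an ImportError, which errors="ignore" does not swallow:

from pandas.compat._optional import import_optional_dependency

# errors="ignore" returns None when the package is not installed instead of
# raising, so callers can degrade gracefully; version checks are then on them.
pymysql = import_optional_dependency("pymysql", errors="ignore")
if pymysql is None:
    print("pymysql not installed - MySQL-backed tests would be skipped")
else:
    print("pymysql", pymysql.__version__)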
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45b290>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45b290> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 
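Note: the hashes module quoted above resolves its concrete implementations from the compiled bindings (Hash = rust_openssl.hashes.Hash), so a mismatch between the cryptography Python layer and its _rust extension surfaces as the AttributeError reported just below, and every importer of cryptography.hazmat.primitives.hashes (pymysql._auth here) fails with it. A minimal sketch of checking for that mismatch, using only names that appear in this traceback:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography (Python layer):", cryptography.__version__)
print("rust openssl bindings expose 'hashes':", hasattr(rust_openssl, "hashes"))
# If the second line prints False, hashes.Hash cannot be resolved and importing
# pymysql (or anything else that needs cryptography's hash primitives) fails.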
2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_to_sql_index_label[index_name-other_label-other_label-mysql_pymysql_conn] _ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s index_name = 'index_name', index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45b3b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45b3b0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _ test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_engine] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s index_name = 'index_name', index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_index_label", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1977: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_index_label' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 
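The psycopg2.OperationalError above is a plain "nothing is listening on localhost:5432": the SQL tests assume a provisioned PostgreSQL server with the credentials visible in the DSN. A small pre-flight check, reusing those same parameters (host, port, dbname, user and password are taken from the traceback, not from any pandas configuration), would show whether the server is reachable at all before the suite runs:

# Pre-flight sketch; parameters copied from the failing DSN above.
import socket

import psycopg2


def postgres_reachable(host="localhost", port=5432):
    # Cheap TCP probe first; "Connection refused" means nothing listens there.
    try:
        with socket.create_connection((host, port), timeout=2):
            pass
    except OSError:
        return False
    # Then a real DBAPI handshake with the same parameters as the test engine.
    try:
        psycopg2.connect(
            host=host, port=port, dbname="pandas",
            user="postgres", password="postgres",
        ).close()
        return True
    except psycopg2.OperationalError:
        return False


print("postgres reachable:", postgres_reachable())

Such a probe separates a missing service on the testbed from a genuine driver or pandas regression.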
2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 
2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = 
_raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _ test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_conn] _ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s index_name = 'index_name', index_label = 'other_label', expected = 'other_label' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
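The conn-fixture variant fails one level earlier, inside request.getfixturevalue, because the postgresql_psycopg2_conn fixture opens a connection from the engine during setup. When no server is expected on the testbed, a common pattern is to skip from the fixture instead of letting the OperationalError surface as a test error; a sketch of that pattern follows (the fixture name is hypothetical and this is not what pandas' test_sql.py does, it is an illustration only):

# Illustrative skip-on-unreachable fixture; not part of the pandas test suite.
import pytest
import sqlalchemy
from sqlalchemy.exc import OperationalError


@pytest.fixture
def postgresql_engine_or_skip():
    # URL mirrors the one in the traceback.
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except OperationalError:
        pytest.skip("PostgreSQL not reachable on localhost:5432")
    yield engine
    engine.dispose()

With such a guard the repeated failures at the 2411s mark would be reported as skips rather than errors.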
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s __________ test_api_to_sql_index_label[0-None-0-mysql_pymysql_engine] __________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s index_name = 0, index_label = None, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
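The sqlalchemy.exc.OperationalError above simply means nothing is listening on localhost:5432: the pandas SQL test fixtures assume a local PostgreSQL server with user "postgres", password "postgres" and a database named "pandas", none of which are provisioned on this testbed. A minimal availability probe under those same assumptions (a hypothetical helper, not part of the pandas test suite) could look like:

from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

def postgres_available(url="postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"):
    # Returns False when the connection attempt is refused, as in the traceback above.
    try:
        engine = create_engine(url)
        with engine.connect():
            return True
    except OperationalError:
        return False

Such a probe only distinguishes "server reachable" from "connection refused"; in this run the server is absent, which is what produces the long tracebacks that follow.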
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
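import_optional_dependency, quoted above, is pandas' internal wrapper around importlib.import_module with minimum-version checking. A usage sketch based only on the docstring shown in this log:

from pandas.compat._optional import import_optional_dependency

# errors="ignore" returns None when the module is not installed, instead of raising ImportError
pymysql = import_optional_dependency("pymysql", errors="ignore")
if pymysql is None:
    print("pymysql not importable; MySQL-backed tests would be skipped")

Note that, as the surrounding traceback shows, an exception raised while the module body executes (here an AttributeError coming from cryptography) is not an ImportError, so it propagates to the test rather than being turned into a skip.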
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45bd10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45bd10> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________ test_api_to_sql_index_label[0-None-0-mysql_pymysql_conn] ___________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s index_name = 0, index_label = None, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 
2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 
2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 
2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 
2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45bdd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb45bdd0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______ test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_engine] _______ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s index_name = 0, index_label = None, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_index_label", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1977: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_index_label' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 
2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 
2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = 
_raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ________ test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_conn] ________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s index_name = 0, index_label = None, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s __________ test_api_to_sql_index_label[None-0-0-mysql_pymysql_engine] __________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s index_name = None, index_label = 0, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b46b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b46b0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________ test_api_to_sql_index_label[None-0-0-mysql_pymysql_conn] ___________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s index_name = None, index_label = 0, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 
2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 
2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 
2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 
2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b4770>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b4770> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______ test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_engine] _______ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s index_name = None, index_label = 0, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_index_label", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1977: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_index_label' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 
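Every PostgreSQL-backed case in this module fails the same way: psycopg2 cannot reach a server on localhost:5432, so the fixture errors out before any pandas code runs. A minimal reproduction of the failing call, using the same parameters as the dsn shown above (illustrative only; on this testbed it takes the except branch):

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432, dbname="pandas",
            user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # No server is listening in this environment, hence "Connection refused".
        print("PostgreSQL unavailable:", exc)
    else:
        conn.close()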
2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 
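The Engine.connect docstring quoted above already shows the intended usage pattern; a self-contained version of it, with the URL reconstructed from the engine repr in this traceback (it assumes a reachable server, which is exactly what is missing here):

    from sqlalchemy import create_engine, text

    # Password shown as *** in the repr above; the dsn elsewhere in the log uses "postgres".
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    with engine.connect() as connection:
        connection.execute(text("SELECT 1"))
        connection.commit()
    # Leaving the block returns the DBAPI connection to the pool and rolls back any
    # implicitly begun transaction, as the docstring describes.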
2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = 
_raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
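As the _handle_dbapi_exception_noconnection frames above show, SQLAlchemy wraps the driver error and re-raises it as sqlalchemy.exc.OperationalError with the original psycopg2 exception attached, which is the form the failure takes just below. A hedged sketch of how calling code can distinguish the two layers:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as connection:
            connection.execute(text("SELECT 1"))
    except OperationalError as exc:
        # exc is the SQLAlchemy wrapper; the raw driver error is kept on exc.orig.
        print(type(exc.orig).__name__)   # OperationalError (from psycopg2)
        print(exc.code)                  # "e3q8" - the suffix of the sqlalche.me link above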
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ________ test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_conn] ________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s index_name = None, index_label = 0, expected = '0' 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "index_name,index_label,expected", 2411s [ 2411s # no index name, defaults to 'index' 2411s (None, None, "index"), 2411s # specifying index_label 2411s (None, "other_label", "other_label"), 2411s # using the index name 2411s ("index_name", None, "index_name"), 2411s # has index name, but specifying index_label 2411s ("index_name", "other_label", "other_label"), 2411s # index name is integer 2411s (0, None, "0"), 2411s # index name is None but index label is integer 2411s (None, 0, "0"), 2411s ], 2411s ) 2411s def test_api_to_sql_index_label(conn, request, index_name, index_label, expected): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:1976: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 
2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. 
(#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
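The psycopg2.connect docstring quoted above lists the keyword parameters the pandas fixtures end up passing (host, port, dbname, user, password). A minimal sketch of the same call outside pytest, assuming psycopg2 is installed and, as on this testbed, nothing is listening on localhost:5432, so an OperationalError like the one in this log is expected:

import psycopg2

try:
    # Same parameters the postgresql_psycopg2_engine fixture resolves to.
    conn = psycopg2.connect(
        host="localhost", port=5432, dbname="pandas",
        user="postgres", password="postgres",
    )
except psycopg2.OperationalError as exc:
    # With no PostgreSQL server running this prints the same
    # "Connection refused" message seen in the traceback above.
    print(f"connection failed: {exc}")
else:
    conn.close()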
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______ test_api_to_sql_index_label_multiindex[postgresql_psycopg2_engine] ______ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_index_label_multiindex(conn, request): 2411s conn_name = conn 2411s if "mysql" in conn_name: 2411s request.applymarker( 2411s pytest.mark.xfail( 2411s reason="MySQL can fail using TEXT without length as key", strict=False 2411s ) 2411s ) 2411s elif "adbc" in conn_name: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_index_label", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2004: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_index_label' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 
2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
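The Engine.connect docstring quoted above describes the context-manager pattern the pandas fixture relies on before the pool checkout fails. A runnable sketch of that same pattern against an in-memory SQLite engine, chosen here only so no database server is required:

from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")   # in-memory database, no server needed
with engine.connect() as connection:
    connection.execute(text("CREATE TABLE t (x INTEGER)"))
    connection.execute(text("INSERT INTO t VALUES (1)"))
    connection.commit()
    print(connection.execute(text("SELECT x FROM t")).scalar())  # 1
# Leaving the with-block returns the DBAPI connection to the pool.
engine.dispose()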
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _______ test_api_to_sql_index_label_multiindex[postgresql_psycopg2_conn] _______ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_to_sql_index_label_multiindex(conn, request): 2411s conn_name = conn 2411s if "mysql" in conn_name: 2411s request.applymarker( 2411s pytest.mark.xfail( 2411s reason="MySQL can fail using TEXT without length as key", strict=False 2411s ) 2411s ) 2411s elif "adbc" in conn_name: 2411s request.node.add_marker( 2411s pytest.mark.xfail(reason="index_label argument NotImplemented with ADBC") 2411s ) 2411s 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2003: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _____________ test_api_multiindex_roundtrip[mysql_pymysql_engine] ______________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_multiindex_roundtrip(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2065: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b59d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b59d0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ______________ test_api_multiindex_roundtrip[mysql_pymysql_conn] _______________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_multiindex_roundtrip(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2065: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
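Editorial note on the failure above: the mysql_pymysql_* tests fail before any connection is attempted, because "import pymysql" itself raises. pymysql's _auth module imports cryptography's hashes primitives, and on this testbed cryptography.hazmat.bindings._rust.openssl has no 'hashes' attribute, which typically indicates that the installed cryptography Python code and its compiled _rust extension do not match. A minimal sketch of how that surfaces (illustrative only; it simply repeats the import chain shown in the traceback):

    try:
        import pymysql  # noqa: F401
    except AttributeError as exc:
        # AttributeError: module 'cryptography.hazmat.bindings._rust.openssl'
        # has no attribute 'hashes'
        print(f"pymysql unusable on this testbed: {exc}")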
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
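The helper quoted in this frame, pandas' internal import_optional_dependency (from pandas/compat/_optional.py), is what versioned_importorskip delegates to. A minimal sketch of how it behaves, assuming only the signature and semantics visible in the docstring above:

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore" returns None when the module cannot be imported at all
    # (and returns the module even if its version is too old), so callers can
    # decide for themselves how to skip.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql unusable; MySQL-backed fixtures would skip")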
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b5af0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b5af0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s __________ test_api_multiindex_roundtrip[postgresql_psycopg2_engine] ___________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
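The AttributeError above (rust_openssl has no attribute 'hashes') is raised while merely importing cryptography's hashes module, which pymysql pulls in through pymysql._auth, so every MySQL-backed fixture in this run fails before it can reach a database. That pattern suggests the python3-cryptography code and its compiled _rust bindings are out of step on this testbed rather than anything in pandas. A minimal check of the same code path, assuming the same environment:

    # On a consistent cryptography install this import and digest succeed;
    # on this testbed the import itself raises the AttributeError shown above.
    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas")
    print(digest.finalize().hex())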
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_multiindex_roundtrip(conn, request): 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_multiindex_roundtrip", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2066: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_multiindex_roundtrip' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
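psycopg2.connect, whose docstring appears above, is the DBAPI entry point the SQLAlchemy pool ultimately calls. The keyword form of the connection these fixtures attempt, using the cparams shown in the traceback (it only succeeds if a PostgreSQL server is actually listening on localhost:5432):

    import psycopg2

    # Equivalent to the dsn 'host=localhost dbname=pandas user=postgres
    # password=postgres port=5432' built by make_dsn() in the frame above.
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
    conn.close()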
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ___________ test_api_multiindex_roundtrip[postgresql_psycopg2_conn] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
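Both PostgreSQL-backed failures in this section come down to the same environmental condition: nothing is listening on localhost:5432 in the testbed, so every checkout from the connection pool ends in "Connection refused". A sketch of what the fixture evidently provides, with the URL inferred from the Engine repr in the traceback (the masked password is 'postgres' per the dsn shown); without a local PostgreSQL server it raises the same OperationalError:

    from sqlalchemy import create_engine

    import pandas.io.sql as sql

    # URL taken from 'Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)'.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    print(sql.has_table("test_multiindex_roundtrip", engine))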
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_multiindex_roundtrip(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2065: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
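getfixturevalue, documented in the frame above, is how these tests turn a parametrized fixture name into the fixture's value at runtime. A self-contained sketch of the same pattern, using a stand-in fixture name so it runs without any database:

    import pytest

    @pytest.fixture
    def sqlite_url():
        return "sqlite:///:memory:"

    # Mirrors pandas' test_sql.py: the parameter is a fixture *name*,
    # resolved inside the test body via request.getfixturevalue().
    @pytest.mark.parametrize("conn", ["sqlite_url"])
    def test_roundtrip_pattern(conn, request):
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")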
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
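# A minimal sketch of the cache-key comparison caveat noted in the pytest
# FixtureDef.execute() source above: for array-like parameters ``==`` is
# elementwise and ``bool()`` on the result raises, so identity is used as a
# fallback. numpy is assumed available here purely for illustration.
import numpy as np

def cache_hit(request_key, cached_key):
    try:
        return bool(request_key == cached_key)
    except (ValueError, RuntimeError):
        # e.g. numpy arrays: the truth value of a multi-element array is ambiguous
        return request_key is cached_key

key = np.array([1, 2, 3])
assert cache_hit(key, key)                      # identity fallback succeeds
assert not cache_hit(key, np.array([1, 2]))     # different object, no hit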
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______________ test_api_dtype_argument[None-mysql_pymysql_engine] ______________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s dtype = None 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b64b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b64b0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______________ test_api_dtype_argument[None-mysql_pymysql_conn] _______________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s dtype = None 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 
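# A minimal sketch of the hashing interface defined in the cryptography
# excerpt above (HashContext: update()/finalize()). On this testbed the
# module fails at import time with the AttributeError shown, which likely
# indicates the pure-Python layer and the compiled _rust bindings come from
# mismatched builds; on a consistent installation the following works.
from cryptography.hazmat.primitives import hashes

digest = hashes.Hash(hashes.SHA256())
digest.update(b"pandas autopkgtest")
print(digest.finalize().hex())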
2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b65d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b65d0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________ test_api_dtype_argument[None-postgresql_psycopg2_engine] ___________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s dtype = None 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_dtype_argument", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2097: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_dtype_argument' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ____________ test_api_dtype_argument[None-postgresql_psycopg2_conn] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s dtype = None 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ______________ test_api_dtype_argument[int-mysql_pymysql_engine] _______________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b6f90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b6f90> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______________ test_api_dtype_argument[int-mysql_pymysql_conn] ________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
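The mysql_pymysql_* failures have a different root cause: importing pymysql pulls in cryptography, and on this image the Rust bindings of python3-cryptography do not expose a hashes attribute, so the import aborts before any MySQL connection is attempted. A minimal sketch (not from the log) that isolates the same AttributeError, assuming the same broken cryptography installation:

# Mirrors the failing line reported above
# (/usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87).
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

# On a healthy install this attribute is the Rust-backed hashes module; on this
# testbed it is missing, which is what produces the AttributeError in the traceback.
print(hasattr(rust_openssl, "hashes"))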
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
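The versioned_importorskip docstring above describes a Debian-specific stand-in for pytest.importorskip that additionally enforces pandas' minimum supported version of the optional dependency. A minimal sketch of the stock pytest pattern it replaces (the minimum-version check is deliberately omitted; only pytest itself is assumed):

    import pytest

    # Skip the calling test module when pymysql is not importable; unlike
    # versioned_importorskip, any installed version is accepted.
    pymysql = pytest.importorskip("pymysql")

    def test_pymysql_importable():
        assert hasattr(pymysql, "connect")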
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b70b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b70b0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________ test_api_dtype_argument[int-postgresql_psycopg2_engine] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
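The AttributeError above is raised while cryptography's hashes module binds its abstract classes to the compiled Rust openssl extension (Hash = rust_openssl.hashes.Hash), which is why the pymysql import attempted by the mysql fixtures fails on this testbed. On an installation where that binding is intact, the HashContext interface quoted above is used roughly as in this sketch (illustrative only):

    from cryptography.hazmat.primitives import hashes

    # Hash is the concrete HashContext implementation provided by the Rust
    # openssl bindings that the traceback shows as missing here.
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas autopkgtest")
    print(digest.finalize().hex())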
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_dtype_argument", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2097: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_dtype_argument' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
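The has_table docstring being quoted here applies equally to a plain sqlite3 connection, which needs no external server; a minimal sketch (the table name is only an example):

    import sqlite3

    import pandas as pd
    from pandas.io import sql

    conn = sqlite3.connect(":memory:")
    pd.DataFrame({"A": [1], "B": [1.5]}).to_sql("test_dtype_argument", conn, index=False)
    print(sql.has_table("test_dtype_argument", conn))  # True
    conn.close()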
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
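The engine.connect() docstring quoted above already shows the intended context-manager usage; spelled out as a self-contained sketch against the same URL the postgresql fixtures use (it will fail with the same Connection refused error unless a PostgreSQL server is actually listening on localhost:5432):

    from sqlalchemy import create_engine, text

    # Same connection URL as the postgresql_psycopg2_engine fixture.
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    with engine.connect() as connection:
        # Checkout goes through the pool path shown in the traceback:
        # Engine.connect -> Pool.connect -> _ConnectionRecord.__connect.
        print(connection.execute(text("SELECT 1")).scalar())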
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
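As the psycopg2.connect docstring quoted above explains, the same connection can also be opened directly with keyword arguments; a minimal sketch using the parameters from the failing DSN (it likewise needs a server accepting TCP/IP connections on port 5432):

    import psycopg2

    # Equivalent to the DSN shown in the traceback:
    #   host=localhost dbname=pandas user=postgres password=postgres port=5432
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())
    conn.close()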
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ____________ test_api_dtype_argument[int-postgresql_psycopg2_conn] _____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
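The getfixturevalue docstring quoted above describes exactly the mechanism test_api_dtype_argument relies on: the parametrized conn value is a fixture name that is resolved at runtime. A self-contained sketch of that pattern (the fixture and test names here are hypothetical):

    import pytest

    @pytest.fixture
    def sqlite_str():
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_resolves_fixture_by_name(conn, request):
        # conn is the *name* of a fixture; look it up dynamically, as the
        # pandas test does with conn = request.getfixturevalue(conn).
        value = request.getfixturevalue(conn)
        assert value.startswith("sqlite")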
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _____________ test_api_dtype_argument[float-mysql_pymysql_engine] ______________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b7ad0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b7ad0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ______________ test_api_dtype_argument[float-mysql_pymysql_conn] _______________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
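[annotation - not part of the captured autopkgtest output] The mysql_pymysql_* parametrizations fail before any connection attempt: importing pymysql pulls in pymysql._auth, which imports cryptography, and cryptography's hashes module raises AttributeError because cryptography.hazmat.bindings._rust.openssl has no 'hashes' attribute on this testbed - most likely the Python layer of python3-cryptography and its compiled _rust extension do not match in this environment. A minimal, hypothetical stand-alone reproduction of the import chain seen above:

    # hypothetical sketch; mirrors the import chain from the traceback above
    try:
        from cryptography.hazmat.primitives import hashes  # loads the _rust openssl bindings
        import pymysql  # only importable once the cryptography import above succeeds
        print("pymysql import OK:", pymysql.VERSION_STRING)
    except Exception as exc:  # AttributeError in this environment
        print(f"pymysql unusable here: {type(exc).__name__}: {exc}")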
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
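The errors argument documented above decides whether a missing or too-old optional dependency raises, warns, or is ignored. A minimal sketch of calling the same helper directly, assuming pymysql purely as the example package (the module path is pandas-internal, exactly as it appears in the traceback):

from pandas.compat._optional import import_optional_dependency

# errors="raise" (the default) raises ImportError when pymysql is absent
# or older than pandas' minimum supported version.
# errors="ignore" returns None when the module is not installed, and returns
# the module even if it is too old, leaving version checks to the caller.
pymysql = import_optional_dependency("pymysql", errors="ignore")
if pymysql is None:
    print("pymysql unavailable; MySQL-backed tests would be skipped")
else:
    print("pymysql", pymysql.__version__)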
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b7bf0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b7bf0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s __________ test_api_dtype_argument[float-postgresql_psycopg2_engine] ___________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_dtype_argument", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2097: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_dtype_argument' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
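The has_table docstring above lists the accepted connection types but gives no example, and the call in the test fails only because building the SQLAlchemy connection needs a live PostgreSQL server. A self-contained sketch of the same call against sqlite3, which needs no server (table name and data are purely illustrative):

import sqlite3
import pandas as pd
from pandas.io import sql

con = sqlite3.connect(":memory:")
pd.DataFrame({"A": [1, 2], "B": [0.5, 1.5]}).to_sql(
    "test_dtype_argument", con, index=False
)
print(sql.has_table("test_dtype_argument", con))  # True
print(sql.has_table("no_such_table", con))        # False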
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ___________ test_api_dtype_argument[float-postgresql_psycopg2_conn] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s dtype = 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _____________ test_api_dtype_argument[dtype3-mysql_pymysql_engine] _____________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s dtype = {'A': , 'B': } 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4805f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4805f0> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ______________ test_api_dtype_argument[dtype3-mysql_pymysql_conn] ______________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s dtype = {'A': , 'B': } 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 
2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
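For context on the pandas helper quoted above, a minimal usage sketch (pandas.compat._optional is a private module; the path and signature are taken from this traceback, and the behaviour follows the captured docstring):

    # Sketch only: call import_optional_dependency the way the docstring above describes.
    from pandas.compat._optional import import_optional_dependency

    # errors="raise" (default): a missing module raises ImportError with an install hint.
    # errors="ignore": a missing module returns None instead of raising.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql is not installed")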
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb480710>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb480710> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s __________ test_api_dtype_argument[dtype3-postgresql_psycopg2_engine] __________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
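The AttributeError above is the root cause of the mysql_pymysql_engine fixture failure: pymysql imports cryptography, whose pure-Python hashes module expects the compiled Rust binding to expose a hashes submodule. A hedged probe (illustrative only; the import line is copied from the traceback) that reproduces the check failing in this testbed:

    # On a consistent cryptography installation this prints True; in this
    # testbed it is False, which surfaces as the AttributeError logged above.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(hasattr(rust_openssl, "hashes"))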
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s dtype = {'A': , 'B': } 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s conn = request.getfixturevalue(conn) 2411s > if sql.has_table("test_dtype_argument", conn): 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2097: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s table_name = 'test_dtype_argument' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None 2411s 2411s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2411s """ 2411s Check if DataBase has named table. 2411s 2411s Parameters 2411s ---------- 2411s table_name: string 2411s Name of SQL table. 2411s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : string, default None 2411s Name of SQL schema in database to write to (if database flavor supports 2411s this). If None, use default schema (default). 
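For reference, a standalone sketch reproducing the connection attempt captured in the dsn/cparams lines above (all parameter values are taken from the log, not new configuration); with no PostgreSQL server listening on localhost:5432 in the testbed it fails with the same error:

    # Illustrative reproduction of the psycopg2 connection the fixture makes.
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", dbname="pandas",
            user="postgres", password="postgres", port=5432,
        )
    except psycopg2.OperationalError as exc:
        # "Connection refused" means nothing is accepting TCP connections on port 5432.
        print(f"postgres not reachable: {exc}")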
2411s 2411s Returns 2411s ------- 2411s boolean 2411s """ 2411s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ___________ test_api_dtype_argument[dtype3-postgresql_psycopg2_conn] ___________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s dtype = {'A': , 'B': } 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s @pytest.mark.parametrize( 2411s "dtype", 2411s [ 2411s None, 2411s int, 2411s float, 2411s {"A": int, "B": float}, 2411s ], 2411s ) 2411s def test_api_dtype_argument(conn, request, dtype): 2411s # GH10285 Add dtype argument to read_sql_query 2411s conn_name = conn 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2096: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _______________ test_api_integer_col_names[mysql_pymysql_engine] _______________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_integer_col_names(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2117: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb481070>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb481070> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ________________ test_api_integer_col_names[mysql_pymysql_conn] ________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_integer_col_names(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2117: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 
2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
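import_optional_dependency, quoted above, is a thin wrapper around importlib.import_module that turns a missing or outdated optional package into a uniform, readable error. A minimal sketch of the same pattern under simplified assumptions (hypothetical helper name; it ignores the install-name mapping pandas keeps in INSTALL_MAPPING and assumes the packaging library is available for version comparison):

    import importlib
    from importlib.metadata import version as dist_version
    from packaging.version import Version

    def load_optional(name, min_version=None):
        # Import an optional dependency, raising a readable error when it is
        # absent or older than the requested minimum version.
        try:
            module = importlib.import_module(name)
        except ImportError as err:
            raise ImportError(f"Missing optional dependency '{name}'. "
                              f"Use pip or conda to install {name}.") from err
        if min_version is not None and Version(dist_version(name)) < Version(min_version):
            raise ImportError(f"'{name}' {dist_version(name)} is installed "
                              f"but at least {min_version} is required.")
        return module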
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb481190>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb481190> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ____________ test_api_integer_col_names[postgresql_psycopg2_engine] ____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_integer_col_names(conn, request): 2411s conn = request.getfixturevalue(conn) 2411s df = DataFrame([[1, 2], [3, 4]], columns=[0, 1]) 2411s > sql.to_sql(df, "test_frame_integer_col_names", conn, if_exists="replace") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2119: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s frame = 0 1 2411s 0 1 2 2411s 1 3 4, name = 'test_frame_integer_col_names' 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, if_exists = 'replace', index = True, index_label = None 2411s chunksize = None, dtype = None, method = None, engine = 'auto' 2411s engine_kwargs = {} 2411s 2411s def to_sql( 2411s frame, 2411s name: str, 2411s con, 2411s schema: str | None = None, 2411s if_exists: Literal["fail", "replace", "append"] = "fail", 2411s index: bool = True, 2411s index_label: IndexLabel | None = None, 2411s chunksize: int | None = None, 2411s dtype: DtypeArg | None = None, 2411s method: Literal["multi"] | Callable | None = None, 2411s engine: str = "auto", 2411s **engine_kwargs, 2411s ) -> int | None: 2411s """ 2411s Write records stored in a DataFrame to a SQL database. 2411s 2411s Parameters 2411s ---------- 2411s frame : DataFrame, Series 2411s name : str 2411s Name of SQL table. 
2411s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2411s or sqlite3 DBAPI2 connection 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library. 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s schema : str, optional 2411s Name of SQL schema in database to write to (if database flavor 2411s supports this). If None, use default schema (default). 2411s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2411s - fail: If table exists, do nothing. 2411s - replace: If table exists, drop it, recreate it, and insert data. 2411s - append: If table exists, insert data. Create if does not exist. 2411s index : bool, default True 2411s Write DataFrame index as a column. 2411s index_label : str or sequence, optional 2411s Column label for index column(s). If None is given (default) and 2411s `index` is True, then the index names are used. 2411s A sequence should be given if the DataFrame uses MultiIndex. 2411s chunksize : int, optional 2411s Specify the number of rows in each batch to be written at a time. 2411s By default, all rows will be written at once. 2411s dtype : dict or scalar, optional 2411s Specifying the datatype for columns. If a dictionary is used, the 2411s keys should be the column names and the values should be the 2411s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2411s scalar is provided, it will be applied to all columns. 2411s method : {None, 'multi', callable}, optional 2411s Controls the SQL insertion clause used: 2411s 2411s - None : Uses standard SQL ``INSERT`` clause (one per row). 2411s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2411s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2411s 2411s Details and a sample callable implementation can be found in the 2411s section :ref:`insert method `. 2411s engine : {'auto', 'sqlalchemy'}, default 'auto' 2411s SQL engine library to use. If 'auto', then the option 2411s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2411s behavior is 'sqlalchemy' 2411s 2411s .. versionadded:: 1.3.0 2411s 2411s **engine_kwargs 2411s Any additional kwargs are passed to the engine. 2411s 2411s Returns 2411s ------- 2411s None or int 2411s Number of rows affected by to_sql. None is returned if the callable 2411s passed into ``method`` does not return an integer number of rows. 2411s 2411s .. versionadded:: 1.4.0 2411s 2411s Notes 2411s ----- 2411s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2411s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2411s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2411s rows as stipulated in the 2411s `sqlite3 `__ or 2411s `SQLAlchemy `__ 2411s """ # noqa: E501 2411s if if_exists not in ("fail", "replace", "append"): 2411s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2411s 2411s if isinstance(frame, Series): 2411s frame = frame.to_frame() 2411s elif not isinstance(frame, DataFrame): 2411s raise NotImplementedError( 2411s "'frame' argument should be either a Series or a DataFrame" 2411s ) 2411s 2411s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = True 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = True 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 
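The to_sql documentation quoted above accepts any SQLAlchemy connectable, which is how these tests drive PostgreSQL. A minimal, self-contained sketch of the same call against an in-memory SQLite engine, so it needs no database server (names are illustrative only):

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")  # in-memory database, no server needed
    df = pd.DataFrame([[1, 2], [3, 4]], columns=["a", "b"])

    rows = df.to_sql("demo", engine, if_exists="replace", index=False)
    print(rows)                                    # number of rows written (2 here)
    print(pd.read_sql("SELECT * FROM demo", engine))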
2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = 
_raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
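Each of these failures ends in psycopg2.OperationalError: Connection refused, i.e. nothing is listening on localhost:5432 in this testbed, so every PostgreSQL-backed variant of the test errors in the same way. A minimal sketch of probing for the server and skipping instead of failing (hypothetical fixture, not pandas' actual conftest; the connection parameters mirror the DSN shown above):

    import pytest

    psycopg2 = pytest.importorskip("psycopg2")

    @pytest.fixture(scope="session")
    def pg_conn():
        # Skip the DB-backed tests when the test database is unreachable.
        try:
            conn = psycopg2.connect(
                host="localhost", port=5432, dbname="pandas",
                user="postgres", password="postgres", connect_timeout=3,
            )
        except psycopg2.OperationalError as err:
            pytest.skip(f"PostgreSQL not available: {err}")
        yield conn
        conn.close()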
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s _____________ test_api_integer_col_names[postgresql_psycopg2_conn] _____________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_integer_col_names(conn, request): 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2117: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 
2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 
2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s __________________ test_api_get_schema[mysql_pymysql_engine] ___________________ 2411s conn = 'mysql_pymysql_engine' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_get_schema(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail( 2411s reason="'get_schema' not implemented for ADBC drivers", 2411s strict=True, 2411s ) 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2131: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 
2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 
2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 
2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 
2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 
2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482090>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482090> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 
2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s ___________________ test_api_get_schema[mysql_pymysql_conn] ____________________ 2411s conn = 'mysql_pymysql_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_get_schema(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail( 2411s reason="'get_schema' not implemented for ADBC drivers", 2411s strict=True, 2411s ) 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2131: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 
2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 
2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s > fixturedef = request._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'mysql_pymysql_engine' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 
2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 
2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
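Aside, for illustration (not part of the test output): the cache-key comparison quoted above falls back to an identity check because ``==`` on some objects cannot be reduced to a single truth value. A minimal sketch of that corner case, using numpy purely as an example:

    import numpy as np

    a = np.array([1, 2])
    b = np.array([1, 3])
    try:
        bool(a == b)          # element-wise result; ambiguous as a single bool
    except ValueError as exc:
        print(f"== is ambiguous for arrays: {exc}")
    print(a is b)             # the identity fallback pytest uses when == raises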
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s @pytest.fixture 2411s def mysql_pymysql_engine(): 2411s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2411s > pymysql = td.versioned_importorskip("pymysql") 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s args = ('pymysql',), kwargs = {} 2411s 2411s def versioned_importorskip(*args, **kwargs): 2411s """ 2411s (warning - this is currently Debian-specific, the name may change if upstream request this) 2411s 2411s Return the requested module, or skip the test if it is 2411s not available in a new enough version. 2411s 2411s Intended as a replacement for pytest.importorskip that 2411s defaults to requiring at least pandas' minimum version for that 2411s optional dependency, rather than any version. 2411s 2411s See import_optional_dependency for full parameter documentation. 2411s """ 2411s try: 2411s > module = import_optional_dependency(*args, **kwargs) 2411s 2411s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2411s 2411s def import_optional_dependency( 2411s name: str, 2411s extra: str = "", 2411s errors: str = "raise", 2411s min_version: str | None = None, 2411s ): 2411s """ 2411s Import an optional dependency. 
2411s 2411s By default, if a dependency is missing an ImportError with a nice 2411s message will be raised. If a dependency is present, but too old, 2411s we raise. 2411s 2411s Parameters 2411s ---------- 2411s name : str 2411s The module name. 2411s extra : str 2411s Additional text to include in the ImportError message. 2411s errors : str {'raise', 'warn', 'ignore'} 2411s What to do when a dependency is not found or its version is too old. 2411s 2411s * raise : Raise an ImportError 2411s * warn : Only applicable when a module's version is to old. 2411s Warns that the version is too old and returns None 2411s * ignore: If the module is not installed, return None, otherwise, 2411s return the module, even if the version is too old. 2411s It's expected that users validate the version locally when 2411s using ``errors="ignore"`` (see. ``io/html.py``) 2411s min_version : str, default None 2411s Specify a minimum version that is different from the global pandas 2411s minimum version required. 2411s Returns 2411s ------- 2411s maybe_module : Optional[ModuleType] 2411s The imported module, when found and the version is correct. 2411s None is returned when the package is not found and `errors` 2411s is False, or when the package's version is too old and `errors` 2411s is ``'warn'`` or ``'ignore'``. 2411s """ 2411s assert errors in {"warn", "raise", "ignore"} 2411s if name=='numba' and warn_numba_platform: 2411s warnings.warn(warn_numba_platform) 2411s 2411s package_name = INSTALL_MAPPING.get(name) 2411s install_name = package_name if package_name is not None else name 2411s 2411s msg = ( 2411s f"Missing optional dependency '{install_name}'. {extra} " 2411s f"Use pip or conda to install {install_name}." 2411s ) 2411s try: 2411s > module = importlib.import_module(name) 2411s 2411s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None 2411s 2411s def import_module(name, package=None): 2411s """Import a module. 2411s 2411s The 'package' argument is required when performing a relative import. It 2411s specifies the package to use as the anchor point from which to resolve the 2411s relative import to an absolute import. 2411s 2411s """ 2411s level = 0 2411s if name.startswith('.'): 2411s if not package: 2411s raise TypeError("the 'package' argument is required to perform a " 2411s f"relative import for {name!r}") 2411s for character in name: 2411s if character != '.': 2411s break 2411s level += 1 2411s > return _bootstrap._gcd_import(name[level:], package, level) 2411s 2411s /usr/lib/python3.13/importlib/__init__.py:88: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', package = None, level = 0 2411s 2411s > ??? 2411s 2411s :1387: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 2411s 2411s :1360: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s name = 'pymysql', import_ = 2411s 2411s > ??? 
2411s 2411s :1331: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482210>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2411s 2411s > ??? 2411s 2411s :935: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482210> 2411s module = 2411s 2411s > ??? 2411s 2411s :1022: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s f = 2411s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2411s kwds = {} 2411s 2411s > ??? 2411s 2411s :488: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s PyMySQL: A pure-Python MySQL client library. 2411s 2411s Copyright (c) 2010-2016 PyMySQL contributors 2411s 2411s Permission is hereby granted, free of charge, to any person obtaining a copy 2411s of this software and associated documentation files (the "Software"), to deal 2411s in the Software without restriction, including without limitation the rights 2411s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2411s copies of the Software, and to permit persons to whom the Software is 2411s furnished to do so, subject to the following conditions: 2411s 2411s The above copyright notice and this permission notice shall be included in 2411s all copies or substantial portions of the Software. 2411s 2411s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2411s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2411s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2411s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2411s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2411s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2411s THE SOFTWARE. 2411s """ 2411s 2411s import sys 2411s 2411s from .constants import FIELD_TYPE 2411s from .err import ( 2411s Warning, 2411s Error, 2411s InterfaceError, 2411s DataError, 2411s DatabaseError, 2411s OperationalError, 2411s IntegrityError, 2411s InternalError, 2411s NotSupportedError, 2411s ProgrammingError, 2411s MySQLError, 2411s ) 2411s from .times import ( 2411s Date, 2411s Time, 2411s Timestamp, 2411s DateFromTicks, 2411s TimeFromTicks, 2411s TimestampFromTicks, 2411s ) 2411s 2411s # PyMySQL version. 2411s # Used by setuptools and connection_attrs 2411s VERSION = (1, 1, 1, "final", 1) 2411s VERSION_STRING = "1.1.1" 2411s 2411s ### for mysqlclient compatibility 2411s ### Django checks mysqlclient version. 2411s version_info = (1, 4, 6, "final", 1) 2411s __version__ = "1.4.6" 2411s 2411s 2411s def get_client_info(): # for MySQLdb compatibility 2411s return __version__ 2411s 2411s 2411s def install_as_MySQLdb(): 2411s """ 2411s After this function is called, any application that imports MySQLdb 2411s will unwittingly actually use pymysql. 
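Aside, for illustration (not part of the test output): the ``install_as_MySQLdb`` helper whose docstring appears just above (its one-line body follows) is a plain ``sys.modules`` alias. A small sketch, guarded so it also runs on hosts where pymysql cannot be imported, as happens on this testbed:

    import importlib
    import sys

    try:
        pymysql = importlib.import_module("pymysql")
    except Exception as exc:  # e.g. the cryptography AttributeError raised below
        print(f"pymysql unavailable here: {exc}")
    else:
        pymysql.install_as_MySQLdb()
        import MySQLdb        # resolves to the pymysql package via the alias
        print(MySQLdb is sys.modules["pymysql"])  # True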
2411s """ 2411s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2411s 2411s 2411s # end of mysqlclient compatibility code 2411s 2411s threadsafety = 1 2411s apilevel = "2.0" 2411s paramstyle = "pyformat" 2411s 2411s > from . import connections # noqa: E402 2411s 2411s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # Python implementation of the MySQL client-server protocol 2411s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2411s # Error codes: 2411s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2411s import errno 2411s import os 2411s import socket 2411s import struct 2411s import sys 2411s import traceback 2411s import warnings 2411s 2411s > from . import _auth 2411s 2411s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s """ 2411s Implements auth methods 2411s """ 2411s 2411s from .err import OperationalError 2411s 2411s 2411s try: 2411s from cryptography.hazmat.backends import default_backend 2411s > from cryptography.hazmat.primitives import serialization, hashes 2411s 2411s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s > from cryptography.hazmat.primitives._serialization import ( 2411s BestAvailableEncryption, 2411s Encoding, 2411s KeySerializationEncryption, 2411s NoEncryption, 2411s ParameterFormat, 2411s PrivateFormat, 2411s PublicFormat, 2411s _KeySerializationEncryption, 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography import utils 2411s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s # This file is dual licensed under the terms of the Apache License, Version 2411s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2411s # for complete details. 
2411s 2411s from __future__ import annotations 2411s 2411s import abc 2411s 2411s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2411s 2411s __all__ = [ 2411s "HashAlgorithm", 2411s "HashContext", 2411s "Hash", 2411s "ExtendableOutputFunction", 2411s "SHA1", 2411s "SHA512_224", 2411s "SHA512_256", 2411s "SHA224", 2411s "SHA256", 2411s "SHA384", 2411s "SHA512", 2411s "SHA3_224", 2411s "SHA3_256", 2411s "SHA3_384", 2411s "SHA3_512", 2411s "SHAKE128", 2411s "SHAKE256", 2411s "MD5", 2411s "BLAKE2b", 2411s "BLAKE2s", 2411s "SM3", 2411s ] 2411s 2411s 2411s class HashAlgorithm(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def name(self) -> str: 2411s """ 2411s A string naming this algorithm (e.g. "sha256", "md5"). 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def digest_size(self) -> int: 2411s """ 2411s The size of the resulting digest in bytes. 2411s """ 2411s 2411s @property 2411s @abc.abstractmethod 2411s def block_size(self) -> int | None: 2411s """ 2411s The internal block size of the hash function, or None if the hash 2411s function does not use blocks internally (e.g. SHA3). 2411s """ 2411s 2411s 2411s class HashContext(metaclass=abc.ABCMeta): 2411s @property 2411s @abc.abstractmethod 2411s def algorithm(self) -> HashAlgorithm: 2411s """ 2411s A HashAlgorithm that will be used by this context. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def update(self, data: bytes) -> None: 2411s """ 2411s Processes the provided bytes through the hash. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def finalize(self) -> bytes: 2411s """ 2411s Finalizes the hash context and returns the hash digest as bytes. 2411s """ 2411s 2411s @abc.abstractmethod 2411s def copy(self) -> HashContext: 2411s """ 2411s Return a HashContext that is a copy of the current context. 2411s """ 2411s 2411s 2411s > Hash = rust_openssl.hashes.Hash 2411s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2411s 2411s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2411s _______________ test_api_get_schema[postgresql_psycopg2_engine] ________________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 
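Aside, for illustration (not part of the test output): the AttributeError above is raised while ``cryptography.hazmat.primitives.hashes`` is itself being imported (pulled in by pymysql), because the installed Rust extension does not expose ``hashes``. A standalone probe for the same condition, purely as a sketch:

    import importlib

    try:
        hashes = importlib.import_module("cryptography.hazmat.primitives.hashes")
    except AttributeError as exc:
        # The pure-Python layer and the _rust extension are out of sync, so the
        # module body fails at ``rust_openssl.hashes`` exactly as in the log.
        print(f"cryptography bindings broken: {exc}")
    else:
        digest = hashes.Hash(hashes.SHA256())
        digest.update(b"pandas")
        print(digest.finalize().hex())  # works on a healthy install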
2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s 
try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. 
The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_get_schema(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail( 2411s reason="'get_schema' not implemented for ADBC drivers", 2411s strict=True, 2411s ) 2411s ) 2411s conn = request.getfixturevalue(conn) 2411s > create_sql = sql.get_schema(test_frame1, "test", con=conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2132: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s frame = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s name = 'test', keys = None 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s dtype = None, schema = None 2411s 2411s def get_schema( 2411s frame, 2411s name: str, 2411s keys=None, 2411s con=None, 2411s dtype: DtypeArg | None = None, 2411s schema: str | None = None, 2411s ) -> str: 2411s """ 2411s Get the SQL db table schema for the given frame. 
2411s 2411s Parameters 2411s ---------- 2411s frame : DataFrame 2411s name : str 2411s name of SQL table 2411s keys : string or sequence, default: None 2411s columns to use a primary key 2411s con: ADBC Connection, SQLAlchemy connectable, sqlite3 connection, default: None 2411s ADBC provides high performance I/O with native type support, where available. 2411s Using SQLAlchemy makes it possible to use any DB supported by that 2411s library 2411s If a DBAPI2 object, only sqlite3 is supported. 2411s dtype : dict of column name to SQL type, default None 2411s Optional specifying the datatype for columns. The SQL type should 2411s be a SQLAlchemy type, or a string for sqlite3 fallback connection. 2411s schema: str, default: None 2411s Optional specifying the schema to be used in creating the table. 2411s """ 2411s > with pandasSQL_builder(con=con) as pandas_sql: 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:2923: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def pandasSQL_builder( 2411s con, 2411s schema: str | None = None, 2411s need_transaction: bool = False, 2411s ) -> PandasSQL: 2411s """ 2411s Convenience function to return the correct PandasSQL subclass based on the 2411s provided parameters. Also creates a sqlalchemy connection and transaction 2411s if necessary. 2411s """ 2411s import sqlite3 2411s 2411s if isinstance(con, sqlite3.Connection) or con is None: 2411s return SQLiteDatabase(con) 2411s 2411s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2411s 2411s if isinstance(con, str) and sqlalchemy is None: 2411s raise ImportError("Using URI string without sqlalchemy installed.") 2411s 2411s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2411s > return SQLDatabase(con, schema, need_transaction) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s schema = None, need_transaction = False 2411s 2411s def __init__( 2411s self, con, schema: str | None = None, need_transaction: bool = False 2411s ) -> None: 2411s from sqlalchemy import create_engine 2411s from sqlalchemy.engine import Engine 2411s from sqlalchemy.schema import MetaData 2411s 2411s # self.exit_stack cleans up the Engine and Connection and commits the 2411s # transaction if any of those objects was created below. 2411s # Cleanup happens either in self.__exit__ or at the end of the iterator 2411s # returned by read_sql when chunksize is not None. 2411s self.exit_stack = ExitStack() 2411s if isinstance(con, str): 2411s con = create_engine(con) 2411s self.exit_stack.callback(con.dispose) 2411s if isinstance(con, Engine): 2411s > con = self.exit_stack.enter_context(con.connect()) 2411s 2411s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 
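Aside, for illustration (not part of the test output): ``get_schema`` as documented above only needs a connectable in order to pick the SQL dialect; nothing is written to the database. A minimal sketch using the sqlite3 fallback, which needs no server (unlike the PostgreSQL fixtures that fail in this run):

    import sqlite3

    import pandas as pd
    from pandas.io import sql as pd_sql

    frame = pd.DataFrame({"A": [1.0, 2.0], "B": ["x", "y"]})
    con = sqlite3.connect(":memory:")
    print(pd_sql.get_schema(frame, "test", con=con))  # prints a CREATE TABLE statement
    con.close()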
2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s 
None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
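Aside, for illustration (not part of the test output): the psycopg2 docstring above describes two equivalent call styles. A sketch using the same parameters the fixtures pass; on this testbed either form fails with the ``Connection refused`` error shown, since nothing is listening on port 5432:

    import psycopg2

    dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    try:
        conn = psycopg2.connect(dsn)  # DSN string form
        # keyword form, equivalent:
        # conn = psycopg2.connect(host="localhost", dbname="pandas",
        #                         user="postgres", password="postgres", port=5432)
        conn.close()
    except psycopg2.OperationalError as exc:
        print(f"as in the log: {exc}")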
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E 2411s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s ________________ test_api_get_schema[postgresql_psycopg2_conn] _________________ 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2411s Create a new database connection. 2411s 2411s The connection parameters can be specified as a string: 2411s 2411s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2411s 2411s or using a set of keyword arguments: 2411s 2411s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2411s 2411s Or as a mix of both. The basic connection parameters are: 2411s 2411s - *dbname*: the database name 2411s - *database*: the database name (only as keyword argument) 2411s - *user*: user name used to authenticate 2411s - *password*: password used to authenticate 2411s - *host*: database host address (defaults to UNIX socket if not provided) 2411s - *port*: connection port number (defaults to 5432 if not provided) 2411s 2411s Using the *connection_factory* parameter a different class or connections 2411s factory can be specified. It should be a callable object taking a dsn 2411s argument. 2411s 2411s Using the *cursor_factory* parameter, a new default cursor factory will be 2411s used by cursor(). 2411s 2411s Using *async*=True an asynchronous connection will be created. *async_* is 2411s a valid alias (for Python versions where ``async`` is a keyword). 2411s 2411s Any other keyword parameter will be passed to the underlying client 2411s library: the list of supported parameters depends on the library version. 
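Aside, for illustration (not part of the test output): the ``Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)`` objects in these tracebacks come from a URL of the form below, and SQLAlchemy wraps the driver failure into ``sqlalchemy.exc.OperationalError``, which is how the wrapped error reported earlier in this log arises. A sketch, assuming psycopg2 is installed as it is on this testbed:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            print(conn.execute(text("SELECT 1")).scalar())
    except OperationalError as exc:
        print(f"wrapped driver error, as in the log: {exc.orig}")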
2411s 2411s """ 2411s kwasync = {} 2411s if 'async' in kwargs: 2411s kwasync['async'] = kwargs.pop('async') 2411s if 'async_' in kwargs: 2411s kwasync['async_'] = kwargs.pop('async_') 2411s 2411s dsn = _ext.make_dsn(dsn, **kwargs) 2411s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2411s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2411s E Is the server running on that host and accepting TCP/IP connections? 2411s 2411s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2411s 2411s The above exception was the direct cause of the following exception: 2411s 2411s conn = 'postgresql_psycopg2_conn' 2411s request = > 2411s test_frame1 = index A B C D 2411s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2411s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2411s 2411s @pytest.mark.parametrize("conn", all_connectable) 2411s def test_api_get_schema(conn, request, test_frame1): 2411s if "adbc" in conn: 2411s request.node.add_marker( 2411s pytest.mark.xfail( 2411s reason="'get_schema' not implemented for ADBC drivers", 2411s strict=True, 2411s ) 2411s ) 2411s > conn = request.getfixturevalue(conn) 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2131: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def getfixturevalue(self, argname: str) -> Any: 2411s """Dynamically run a named fixture function. 2411s 2411s Declaring fixtures via function argument is recommended where possible. 2411s But if you can only decide whether to use another fixture at test 2411s setup time, you may use this function to retrieve it inside a fixture 2411s or test function body. 2411s 2411s This method can be used during the test setup phase or the test run 2411s phase, but during the test teardown phase a fixture's value may not 2411s be available. 2411s 2411s :param argname: 2411s The fixture name. 2411s :raises pytest.FixtureLookupError: 2411s If the given fixture could not be found. 2411s """ 2411s # Note that in addition to the use case described in the docstring, 2411s # getfixturevalue() is also called by pytest itself during item and fixture 2411s # setup to evaluate the fixtures that are requested statically 2411s # (using function parameters, autouse, etc). 2411s 2411s > fixturedef = self._get_active_fixturedef(argname) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = > 2411s argname = 'postgresql_psycopg2_conn' 2411s 2411s def _get_active_fixturedef( 2411s self, argname: str 2411s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2411s if argname == "request": 2411s cached_result = (self, [0], None) 2411s return PseudoFixtureDef(cached_result, Scope.Function) 2411s 2411s # If we already finished computing a fixture by this name in this item, 2411s # return it. 2411s fixturedef = self._fixture_defs.get(argname) 2411s if fixturedef is not None: 2411s self._check_scope(fixturedef, fixturedef._scope) 2411s return fixturedef 2411s 2411s # Find the appropriate fixturedef. 
2411s fixturedefs = self._arg2fixturedefs.get(argname, None) 2411s if fixturedefs is None: 2411s # We arrive here because of a dynamic call to 2411s # getfixturevalue(argname) which was naturally 2411s # not known at parsing/collection time. 2411s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2411s if fixturedefs is not None: 2411s self._arg2fixturedefs[argname] = fixturedefs 2411s # No fixtures defined with this name. 2411s if fixturedefs is None: 2411s raise FixtureLookupError(argname, self) 2411s # The are no fixtures with this name applicable for the function. 2411s if not fixturedefs: 2411s raise FixtureLookupError(argname, self) 2411s # A fixture may override another fixture with the same name, e.g. a 2411s # fixture in a module can override a fixture in a conftest, a fixture in 2411s # a class can override a fixture in the module, and so on. 2411s # An overriding fixture can request its own name (possibly indirectly); 2411s # in this case it gets the value of the fixture it overrides, one level 2411s # up. 2411s # Check how many `argname`s deep we are, and take the next one. 2411s # `fixturedefs` is sorted from furthest to closest, so use negative 2411s # indexing to go in reverse. 2411s index = -1 2411s for request in self._iter_chain(): 2411s if request.fixturename == argname: 2411s index -= 1 2411s # If already consumed all of the available levels, fail. 2411s if -index > len(fixturedefs): 2411s raise FixtureLookupError(argname, self) 2411s fixturedef = fixturedefs[index] 2411s 2411s # Prepare a SubRequest object for calling the fixture. 2411s try: 2411s callspec = self._pyfuncitem.callspec 2411s except AttributeError: 2411s callspec = None 2411s if callspec is not None and argname in callspec.params: 2411s param = callspec.params[argname] 2411s param_index = callspec.indices[argname] 2411s # The parametrize invocation scope overrides the fixture's scope. 2411s scope = callspec._arg2scope[argname] 2411s else: 2411s param = NOTSET 2411s param_index = 0 2411s scope = fixturedef._scope 2411s self._check_fixturedef_without_param(fixturedef) 2411s self._check_scope(fixturedef, scope) 2411s subrequest = SubRequest( 2411s self, scope, param, param_index, fixturedef, _ispytest=True 2411s ) 2411s 2411s # Make sure the fixture value is cached, running it if it isn't 2411s > fixturedef.execute(request=subrequest) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s request = > 2411s 2411s def execute(self, request: SubRequest) -> FixtureValue: 2411s """Return the value of this fixture, executing it if not cached.""" 2411s # Ensure that the dependent fixtures requested by this fixture are loaded. 2411s # This needs to be done before checking if we have a cached value, since 2411s # if a dependent fixture has their cache invalidated, e.g. due to 2411s # parametrization, they finalize themselves and fixtures depending on it 2411s # (which will likely include this fixture) setting `self.cached_result = None`. 2411s # See #4871 2411s requested_fixtures_that_should_finalize_us = [] 2411s for argname in self.argnames: 2411s fixturedef = request._get_active_fixturedef(argname) 2411s # Saves requested fixtures in a list so we later can add our finalizer 2411s # to them, ensuring that if a requested fixture gets torn down we get torn 2411s # down first. 
This is generally handled by SetupState, but still currently 2411s # needed when this fixture is not parametrized but depends on a parametrized 2411s # fixture. 2411s if not isinstance(fixturedef, PseudoFixtureDef): 2411s requested_fixtures_that_should_finalize_us.append(fixturedef) 2411s 2411s # Check for (and return) cached value/exception. 2411s if self.cached_result is not None: 2411s request_cache_key = self.cache_key(request) 2411s cache_key = self.cached_result[1] 2411s try: 2411s # Attempt to make a normal == check: this might fail for objects 2411s # which do not implement the standard comparison (like numpy arrays -- #6497). 2411s cache_hit = bool(request_cache_key == cache_key) 2411s except (ValueError, RuntimeError): 2411s # If the comparison raises, use 'is' as fallback. 2411s cache_hit = request_cache_key is cache_key 2411s 2411s if cache_hit: 2411s if self.cached_result[2] is not None: 2411s exc, exc_tb = self.cached_result[2] 2411s raise exc.with_traceback(exc_tb) 2411s else: 2411s result = self.cached_result[0] 2411s return result 2411s # We have a previous but differently parametrized fixture instance 2411s # so we need to tear it down before creating a new one. 2411s self.finish(request) 2411s assert self.cached_result is None 2411s 2411s # Add finalizer to requested fixtures we saved previously. 2411s # We make sure to do this after checking for cached value to avoid 2411s # adding our finalizer multiple times. (#12135) 2411s finalizer = functools.partial(self.finish, request=request) 2411s for parent_fixture in requested_fixtures_that_should_finalize_us: 2411s parent_fixture.addfinalizer(finalizer) 2411s 2411s ihook = request.node.ihook 2411s try: 2411s # Setup the fixture, run the code in it, and cache the value 2411s # in self.cached_result 2411s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def __call__(self, **kwargs: object) -> Any: 2411s """Call the hook. 2411s 2411s Only accepts keyword arguments, which should match the hook 2411s specification. 2411s 2411s Returns the result(s) of calling all registered plugins, see 2411s :ref:`calling`. 2411s """ 2411s assert ( 2411s not self.is_historic() 2411s ), "Cannot directly call a historic hook - use call_historic instead." 2411s self._verify_all_args_are_provided(kwargs) 2411s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2411s # Copy because plugins may register other plugins during iteration (#438). 2411s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2411s hook_name = 'pytest_fixture_setup' 2411s methods = [>] 2411s kwargs = {'fixturedef': , 'request': >} 2411s firstresult = True 2411s 2411s def _hookexec( 2411s self, 2411s hook_name: str, 2411s methods: Sequence[HookImpl], 2411s kwargs: Mapping[str, object], 2411s firstresult: bool, 2411s ) -> object | list[object]: 2411s # called from all hookcaller instances. 
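[editor's note] The frames above are pluggy's generic hook dispatch (HookCaller.__call__ -> _hookexec), which pytest uses to run pytest_fixture_setup. A minimal sketch of pluggy outside pytest, showing the keyword-only call and the firstresult behaviour referenced in the code above (the project and plugin names are made up):

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)
    def setup_resource(self, name):
        """Return a resource for *name*; the first non-None result wins."""

class Plugin:
    @hookimpl
    def setup_resource(self, name):
        return f"resource:{name}"

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(Plugin())

# Hooks are called with keyword arguments only, as in
# ihook.pytest_fixture_setup(fixturedef=..., request=...)
print(pm.hook.setup_resource(name="db"))   # -> "resource:db"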
2411s # enable_tracing will set its own wrapping function at self._inner_hookexec 2411s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2411s 2411s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s @pytest.hookimpl(wrapper=True) 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[object], request: SubRequest 2411s ) -> Generator[None, object, object]: 2411s try: 2411s > return (yield) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturedef = 2411s request = > 2411s 2411s def pytest_fixture_setup( 2411s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2411s ) -> FixtureValue: 2411s """Execution of fixture setup.""" 2411s kwargs = {} 2411s for argname in fixturedef.argnames: 2411s kwargs[argname] = request.getfixturevalue(argname) 2411s 2411s fixturefunc = resolve_fixture_function(fixturedef, request) 2411s my_cache_key = fixturedef.cache_key(request) 2411s try: 2411s > result = call_fixture_func(fixturefunc, request, kwargs) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s fixturefunc = 2411s request = > 2411s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2411s 2411s def call_fixture_func( 2411s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2411s ) -> FixtureValue: 2411s if is_generator(fixturefunc): 2411s fixturefunc = cast( 2411s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2411s ) 2411s generator = fixturefunc(**kwargs) 2411s try: 2411s > fixture_result = next(generator) 2411s 2411s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s @pytest.fixture 2411s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2411s > with postgresql_psycopg2_engine.connect() as conn: 2411s 2411s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def connect(self) -> Connection: 2411s """Return a new :class:`_engine.Connection` object. 2411s 2411s The :class:`_engine.Connection` acts as a Python context manager, so 2411s the typical use of this method looks like:: 2411s 2411s with engine.connect() as connection: 2411s connection.execute(text("insert into table values ('foo')")) 2411s connection.commit() 2411s 2411s Where above, after the block is completed, the connection is "closed" 2411s and its underlying DBAPI resources are returned to the connection pool. 2411s This also has the effect of rolling back any transaction that 2411s was explicitly begun or was begun via autobegin, and will 2411s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2411s started and is still in progress. 2411s 2411s .. 
seealso:: 2411s 2411s :meth:`_engine.Engine.begin` 2411s 2411s """ 2411s 2411s > return self._connection_cls(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s self._dbapi_connection = engine.raw_connection() 2411s except dialect.loaded_dbapi.Error as err: 2411s > Connection._handle_dbapi_exception_noconnection( 2411s err, dialect, engine 2411s ) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2411s dialect = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2411s 2411s @classmethod 2411s def _handle_dbapi_exception_noconnection( 2411s cls, 2411s e: BaseException, 2411s dialect: Dialect, 2411s engine: Optional[Engine] = None, 2411s is_disconnect: Optional[bool] = None, 2411s invalidate_pool_on_disconnect: bool = True, 2411s is_pre_ping: bool = False, 2411s ) -> NoReturn: 2411s exc_info = sys.exc_info() 2411s 2411s if is_disconnect is None: 2411s is_disconnect = isinstance( 2411s e, dialect.loaded_dbapi.Error 2411s ) and dialect.is_disconnect(e, None, None) 2411s 2411s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2411s 2411s if should_wrap: 2411s sqlalchemy_exception = exc.DBAPIError.instance( 2411s None, 2411s None, 2411s cast(Exception, e), 2411s dialect.loaded_dbapi.Error, 2411s hide_parameters=( 2411s engine.hide_parameters if engine is not None else False 2411s ), 2411s connection_invalidated=is_disconnect, 2411s dialect=dialect, 2411s ) 2411s else: 2411s sqlalchemy_exception = None 2411s 2411s newraise = None 2411s 2411s if dialect._has_events: 2411s ctx = ExceptionContextImpl( 2411s e, 2411s sqlalchemy_exception, 2411s engine, 2411s dialect, 2411s None, 2411s None, 2411s None, 2411s None, 2411s None, 2411s is_disconnect, 2411s invalidate_pool_on_disconnect, 2411s is_pre_ping, 2411s ) 2411s for fn in dialect.dispatch.handle_error: 2411s try: 2411s # handler returns an exception; 2411s # call next handler in a chain 2411s per_fn = fn(ctx) 2411s if per_fn is not None: 2411s ctx.chained_exception = newraise = per_fn 2411s except Exception as _raised: 2411s # handler raises an exception - stop processing 2411s newraise = _raised 2411s break 2411s 2411s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2411s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2411s ctx.is_disconnect 2411s ) 2411s 2411s if newraise: 2411s raise 
newraise.with_traceback(exc_info[2]) from e 2411s elif should_wrap: 2411s assert sqlalchemy_exception is not None 2411s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s connection = None, _has_events = None, _allow_revalidate = True 2411s _allow_autobegin = True 2411s 2411s def __init__( 2411s self, 2411s engine: Engine, 2411s connection: Optional[PoolProxiedConnection] = None, 2411s _has_events: Optional[bool] = None, 2411s _allow_revalidate: bool = True, 2411s _allow_autobegin: bool = True, 2411s ): 2411s """Construct a new Connection.""" 2411s self.engine = engine 2411s self.dialect = dialect = engine.dialect 2411s 2411s if connection is None: 2411s try: 2411s > self._dbapi_connection = engine.raw_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2411s 2411s def raw_connection(self) -> PoolProxiedConnection: 2411s """Return a "raw" DBAPI connection from the connection pool. 2411s 2411s The returned object is a proxied version of the DBAPI 2411s connection object used by the underlying driver in use. 2411s The object will have all the same behavior as the real DBAPI 2411s connection, except that its ``close()`` method will result in the 2411s connection being returned to the pool, rather than being closed 2411s for real. 2411s 2411s This method provides direct DBAPI connection access for 2411s special situations when the API provided by 2411s :class:`_engine.Connection` 2411s is not needed. When a :class:`_engine.Connection` object is already 2411s present, the DBAPI connection is available using 2411s the :attr:`_engine.Connection.connection` accessor. 2411s 2411s .. seealso:: 2411s 2411s :ref:`dbapi_connections` 2411s 2411s """ 2411s > return self.pool.connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def connect(self) -> PoolProxiedConnection: 2411s """Return a DBAPI connection from the pool. 2411s 2411s The connection is instrumented such that when its 2411s ``close()`` method is called, the connection will be returned to 2411s the pool. 
2411s 2411s """ 2411s > return _ConnectionFairy._checkout(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s threadconns = None, fairy = None 2411s 2411s @classmethod 2411s def _checkout( 2411s cls, 2411s pool: Pool, 2411s threadconns: Optional[threading.local] = None, 2411s fairy: Optional[_ConnectionFairy] = None, 2411s ) -> _ConnectionFairy: 2411s if not fairy: 2411s > fairy = _ConnectionRecord.checkout(pool) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s cls = 2411s pool = 2411s 2411s @classmethod 2411s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2411s if TYPE_CHECKING: 2411s rec = cast(_ConnectionRecord, pool._do_get()) 2411s else: 2411s > rec = pool._do_get() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _do_get(self) -> ConnectionPoolEntry: 2411s > return self._create_connection() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def _create_connection(self) -> ConnectionPoolEntry: 2411s """Called by subclasses to create a new ConnectionRecord.""" 2411s 2411s > return _ConnectionRecord(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s pool = , connect = True 2411s 2411s def __init__(self, pool: Pool, connect: bool = True): 2411s self.fresh = False 2411s self.fairy_ref = None 2411s self.starttime = 0 2411s self.dbapi_connection = None 2411s 2411s self.__pool = pool 2411s if connect: 2411s > self.__connect() 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s self.dbapi_connection = connection = pool._invoke_creator(self) 2411s pool.logger.debug("Created new connection %r", connection) 2411s self.fresh = True 2411s except BaseException as e: 2411s > with util.safe_reraise(): 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s type_ = None, value = None, traceback = None 2411s 2411s def __exit__( 2411s self, 2411s type_: Optional[Type[BaseException]], 2411s value: Optional[BaseException], 2411s traceback: Optional[types.TracebackType], 2411s ) -> NoReturn: 2411s assert self._exc_info is not None 2411s # see #2703 for notes 2411s if type_ is None: 2411s exc_type, exc_value, exc_tb = self._exc_info 2411s assert exc_value is not None 2411s self._exc_info = None # remove potential circular references 2411s > raise exc_value.with_traceback(exc_tb) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s 2411s def __connect(self) -> None: 2411s pool = self.__pool 2411s 2411s # ensure any existing connection is removed, so that if 2411s # creator fails, this attribute stays None 2411s self.dbapi_connection = None 2411s try: 2411s self.starttime = time.time() 2411s > self.dbapi_connection = connection = pool._invoke_creator(self) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s connection_record = 2411s 2411s def connect( 2411s connection_record: Optional[ConnectionPoolEntry] = None, 2411s ) -> DBAPIConnection: 2411s if dialect._has_events: 2411s for fn in dialect.dispatch.do_connect: 2411s connection = cast( 2411s DBAPIConnection, 2411s fn(dialect, connection_record, cargs, cparams), 2411s ) 2411s if connection is not None: 2411s return connection 2411s 2411s > return dialect.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s self = 2411s cargs = () 2411s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s 2411s def connect(self, *cargs, **cparams): 2411s # inherits the docstring from interfaces.Dialect.connect 2411s > return self.loaded_dbapi.connect(*cargs, **cparams) 2411s 2411s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2411s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2411s 2411s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2411s connection_factory = None, cursor_factory = None 2411s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2411s kwasync = {} 2411s 2411s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2411s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_api_get_schema_with_schema[mysql_pymysql_engine] _____________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_with_schema(conn, request, test_frame1): 2412s # GH28486 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 
2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 
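[editor's note] The versioned_importorskip helper above is described in its own docstring as a Debian-specific replacement for pytest.importorskip that also enforces pandas' minimum supported version of the optional dependency. The stock pytest equivalent, for comparison (the version string here is only an example, not pandas' actual minimum):

import pytest

# Skip the test unless pymysql is importable and at least the requested
# version; otherwise it behaves like a plain import and returns the module.
pymysql = pytest.importorskip("pymysql", minversion="1.0.0")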
2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482bd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482bd0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 
2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
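[editor's note] A few frames back, the PyMySQL package source shows the install_as_MySQLdb() compatibility shim that aliases the module under the name MySQLdb. A minimal sketch of what that shim does for callers; as the source above shows, it only touches sys.modules:

import sys
import pymysql

pymysql.install_as_MySQLdb()
import MySQLdb  # now resolves to the pymysql module object

assert MySQLdb is pymysql
assert sys.modules["MySQLdb"] is sys.modules["pymysql"]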
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____________ test_api_get_schema_with_schema[mysql_pymysql_conn] ______________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_with_schema(conn, request, test_frame1): 2412s # GH28486 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 
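[editor's note] Unlike the PostgreSQL failures, the pymysql parametrizations never reach a connection attempt: importing pymysql pulls in cryptography, and cryptography.hazmat.primitives.hashes fails with the AttributeError shown above because the compiled _rust binding module does not export `hashes`, i.e. the pure-Python layer and the Rust extension appear to come from mismatched builds on this testbed. A small check that reproduces just that probe (assuming the same broken installation):

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print(cryptography.__version__)
# On a consistent installation this is True; here it is False, which is
# exactly what hashes.py trips over ("Hash = rust_openssl.hashes.Hash").
print(hasattr(rust_openssl, "hashes"))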
2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482cf0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb482cf0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________ test_api_get_schema_with_schema[postgresql_psycopg2_engine] __________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_with_schema(conn, request, test_frame1): 2412s # GH28486 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn = request.getfixturevalue(conn) 2412s > create_sql = sql.get_schema(test_frame1, "test", con=conn, schema="pypi") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2147: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s frame = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s name = 'test', keys = None 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s dtype = None, schema = 'pypi' 2412s 2412s def get_schema( 2412s frame, 2412s name: str, 2412s keys=None, 2412s con=None, 2412s dtype: DtypeArg | None = None, 2412s schema: str | None = None, 2412s ) -> str: 2412s """ 2412s Get the SQL db table schema for the given frame. 
2412s 2412s Parameters 2412s ---------- 2412s frame : DataFrame 2412s name : str 2412s name of SQL table 2412s keys : string or sequence, default: None 2412s columns to use a primary key 2412s con: ADBC Connection, SQLAlchemy connectable, sqlite3 connection, default: None 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s dtype : dict of column name to SQL type, default None 2412s Optional specifying the datatype for columns. The SQL type should 2412s be a SQLAlchemy type, or a string for sqlite3 fallback connection. 2412s schema: str, default: None 2412s Optional specifying the schema to be used in creating the table. 2412s """ 2412s > with pandasSQL_builder(con=con) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:2923: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 
2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s 
None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s __________ test_api_get_schema_with_schema[postgresql_psycopg2_conn] ___________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_with_schema(conn, request, test_frame1): 2412s # GH28486 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
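The engine repr shown above (postgresql+psycopg2://postgres:***@localhost:5432/pandas) is being driven through the connect()/raw_connection() paths that are unwinding here. In a working environment the same engine is used roughly as follows; a sketch only, with the password taken from the dsn quoted earlier in this traceback:

    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    # Engine.connect(): check a pooled connection out, run SQL, return it on exit
    with engine.connect() as connection:
        connection.execute(text("SELECT 1"))
        connection.commit()

    # Engine.raw_connection(): the proxied DBAPI connection itself;
    # close() returns it to the pool rather than closing it for real
    raw = engine.raw_connection()
    raw.close()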
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _______________ test_api_get_schema_dtypes[mysql_pymysql_engine] _______________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_dtypes(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2161: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483170>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483170> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
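install_as_MySQLdb(), whose docstring appears above, is PyMySQL's documented MySQLdb compatibility shim; its one-line implementation follows below. In an environment where pymysql imports cleanly (it does not here), it is used like this:

    import pymysql

    pymysql.install_as_MySQLdb()   # registers pymysql under the name "MySQLdb"
    import MySQLdb

    assert MySQLdb is pymysql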
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ________________ test_api_get_schema_dtypes[mysql_pymysql_conn] ________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_dtypes(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2161: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
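The AttributeError above is the real root cause of both pymysql-based failures: the pure-Python side of the cryptography package and its compiled _rust extension are out of step on this testbed, so rust_openssl exposes no hashes module and importing pymysql fails before any MySQL server is ever contacted. A small diagnostic sketch of the same import chain outside pytest (illustrative only):

    # Reproduce the failing import chain step by step
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # On this testbed the compiled extension lacks the submodule the Python
    # layer expects, so this prints False (matching the AttributeError above):
    print(hasattr(rust_openssl, "hashes"))

    # ...and in that state this import raises the same AttributeError:
    from cryptography.hazmat.primitives import hashes  # noqa: F401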
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
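The pluggy frames above are the generic hook-call path; pytest_fixture_setup is a firstresult hook, so the first implementation that returns a non-None value ends the call. A toy sketch of that protocol; the project name and hook below are illustrative, not pytest's own:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def compute(self, x):
            """First non-None result wins; remaining plugins are not called."""

    class Doubler:
        @hookimpl
        def compute(self, x):
            return x * 2

    class Decliner:
        @hookimpl
        def compute(self, x):
            return None        # None means "no answer", so the call continues

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Doubler())
    pm.register(Decliner())        # registered last, therefore called first
    print(pm.hook.compute(x=21))   # 42: Decliner declines, Doubler answers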
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483290>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483290> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ____________ test_api_get_schema_dtypes[postgresql_psycopg2_engine] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
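The mysql_pymysql_engine failure above bottoms out in the AttributeError raised while cryptography's hashes module executes 'Hash = rust_openssl.hashes.Hash': the installed _rust binding exposes no 'hashes' attribute, which usually points to a mismatch between cryptography's Python code and its compiled extension (plausible on this i386-on-amd64 testbed). pymysql wraps the cryptography import in a try block, but a guard that only catches ImportError (the except clause is not visible in the excerpt) cannot swallow an AttributeError raised inside the imported module, so 'import pymysql' fails outright. A contrived sketch of that distinction:

    def import_hashes():
        # Stands in for cryptography/hazmat/primitives/hashes.py executing
        # "Hash = rust_openssl.hashes.Hash" against a binding without 'hashes'.
        raise AttributeError("module '..._rust.openssl' has no attribute 'hashes'")

    try:
        import_hashes()
    except ImportError:
        crypto = None                      # the usual optional-dependency fallback
    except AttributeError as exc:
        # The branch actually taken: not an ImportError, so a guard that only
        # catches ImportError would let this escape to the caller.
        print(f"optional import failed: {exc}")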
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_dtypes(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s conn = request.getfixturevalue(conn) 2412s float_frame = DataFrame({"a": [1.1, 1.2], "b": [2.1, 2.2]}) 2412s 2412s if conn_name == "sqlite_buildin": 2412s dtype = "INTEGER" 2412s else: 2412s from sqlalchemy import Integer 2412s 2412s dtype = Integer 2412s > create_sql = sql.get_schema(float_frame, "test", con=conn, dtype={"b": dtype}) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2170: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s frame = a b 2412s 0 1.1 2.1 2412s 1 1.2 2.2, name = 'test', keys = None 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s dtype = {'b': }, schema = None 2412s 2412s def get_schema( 2412s frame, 2412s name: str, 2412s keys=None, 2412s con=None, 2412s dtype: DtypeArg | None = None, 2412s schema: str | None = None, 2412s ) -> str: 2412s """ 2412s Get the SQL db table schema for the given frame. 
2412s 2412s Parameters 2412s ---------- 2412s frame : DataFrame 2412s name : str 2412s name of SQL table 2412s keys : string or sequence, default: None 2412s columns to use a primary key 2412s con: ADBC Connection, SQLAlchemy connectable, sqlite3 connection, default: None 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s dtype : dict of column name to SQL type, default None 2412s Optional specifying the datatype for columns. The SQL type should 2412s be a SQLAlchemy type, or a string for sqlite3 fallback connection. 2412s schema: str, default: None 2412s Optional specifying the schema to be used in creating the table. 2412s """ 2412s > with pandasSQL_builder(con=con) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:2923: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 
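The failing call above is sql.get_schema() against the postgres engine fixture; as the pandasSQL_builder() excerpt shows, the same entry point also accepts a plain sqlite3 connection, which needs no server. A minimal sketch of that fallback path (output shown approximately):

    import sqlite3

    import pandas as pd
    from pandas.io import sql

    frame = pd.DataFrame({"a": [1.1, 1.2], "b": [2.1, 2.2]})
    con = sqlite3.connect(":memory:")
    try:
        # SQLite fallback: dtype overrides are plain SQL type strings here.
        print(sql.get_schema(frame, "test", con=con, dtype={"b": "INTEGER"}))
        # roughly: CREATE TABLE "test" ( "a" REAL, "b" INTEGER )
    finally:
        con.close()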
2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s 
None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
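The raw_connection() docstring quoted above describes a pooled, proxied DBAPI connection whose close() merely returns it to the pool. The behaviour is easy to see with an in-memory SQLite engine (no server involved); a sketch, not anything from the failing test:

    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")
    raw = engine.raw_connection()          # proxied DBAPI connection from the pool
    cur = raw.cursor()
    cur.execute("select 1")
    print(cur.fetchone())                  # (1,)
    raw.close()                            # returned to the pool, not truly closed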
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
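The locals above already show the merged DSN string, and the make_dsn() call that produces it from the keyword arguments appears just below; the same helper is public as psycopg2.extensions.make_dsn. A small sketch (psycopg2 required, but no connection is attempted):

    from psycopg2.extensions import make_dsn

    dsn = make_dsn(host="localhost", dbname="pandas", user="postgres", port=5432)
    print(dsn)   # e.g. "host=localhost dbname=pandas user=postgres port=5432"
                 # (key order may differ)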
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____________ test_api_get_schema_dtypes[postgresql_psycopg2_conn] _____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_dtypes(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2161: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________________ test_api_get_schema_keys[mysql_pymysql_engine] ________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_keys(conn, request, test_frame1): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2185: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 
2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 
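versioned_importorskip and import_optional_dependency, whose docstrings are quoted above, are the pandas test helpers that decide whether an optional backend is importable and new enough. Note that on this testbed the pymysql import fails with an AttributeError rather than an ImportError (see the following frames), which is why the failure is reported as an error instead of a skip. As a rough usage sketch of the documented errors= behaviour (import_optional_dependency lives in the private pandas.compat._optional module, so treat this as a debugging aid rather than supported API):

    # Illustrative sketch of the errors= behaviour described in the docstring above.
    from pandas.compat._optional import import_optional_dependency

    # errors="raise" (the default) raises ImportError if pymysql is missing or too old;
    # errors="ignore" returns None instead, letting the caller decide what to do.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql unavailable or too old; MySQL-backed tests would be skipped")
    else:
        print("pymysql", pymysql.__version__)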
2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483e90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483e90> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 
2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________________ test_api_get_schema_keys[mysql_pymysql_conn] _________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_keys(conn, request, test_frame1): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2185: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 
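The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while pymysql imports cryptography, and typically indicates that the pure-Python part of the cryptography package and its compiled _rust bindings come from different versions. A small diagnostic sketch (not part of any test; names chosen for illustration) that surfaces the same mismatch outside pytest:

    # Hypothetical diagnostic sketch: report where cryptography is loaded from and
    # whether the hashes module imports cleanly, mirroring the failure in this log.
    import cryptography

    print("cryptography", cryptography.__version__, "from", cryptography.__file__)
    try:
        from cryptography.hazmat.primitives import hashes
        print("hashes OK:", hashes.SHA256().name)
    except AttributeError as exc:
        # Same symptom as the pymysql import failure recorded above.
        print("cryptography python/_rust mismatch:", exc)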
2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483dd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb483dd0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____________ test_api_get_schema_keys[postgresql_psycopg2_engine] _____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
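[editor's note] The mysql_pymysql_engine fixture failure above bottoms out in "AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'", i.e. the pure-Python side of python3-cryptography on this testbed does not match its compiled Rust bindings. A minimal sketch (not part of the test suite; the prints are illustrative only) that follows the same import chain pymysql triggers and reports whether the bindings expose the expected attribute:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography version:", cryptography.__version__)
print("Rust bindings expose 'hashes':", hasattr(rust_openssl, "hashes"))

try:
    # Same import chain pymysql._auth triggers via the serialization module.
    from cryptography.hazmat.primitives import hashes
    print("hashes module import OK:", hashes.SHA256().name)
except AttributeError as err:
    print("python3-cryptography does not match its Rust bindings:", err)

[end of editor's note]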
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_keys(conn, request, test_frame1): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s conn = request.getfixturevalue(conn) 2412s frame = DataFrame({"Col1": [1.1, 1.2], "Col2": [2.1, 2.2]}) 2412s > create_sql = sql.get_schema(frame, "test", con=conn, keys="Col1") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2187: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s frame = Col1 Col2 2412s 0 1.1 2.1 2412s 1 1.2 2.2, name = 'test', keys = 'Col1' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s dtype = None, schema = None 2412s 2412s def get_schema( 2412s frame, 2412s name: str, 2412s keys=None, 2412s con=None, 2412s dtype: DtypeArg | None = None, 2412s schema: str | None = None, 2412s ) -> str: 2412s """ 2412s Get the SQL db table schema for the given frame. 
2412s 2412s Parameters 2412s ---------- 2412s frame : DataFrame 2412s name : str 2412s name of SQL table 2412s keys : string or sequence, default: None 2412s columns to use a primary key 2412s con: ADBC Connection, SQLAlchemy connectable, sqlite3 connection, default: None 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s dtype : dict of column name to SQL type, default None 2412s Optional specifying the datatype for columns. The SQL type should 2412s be a SQLAlchemy type, or a string for sqlite3 fallback connection. 2412s schema: str, default: None 2412s Optional specifying the schema to be used in creating the table. 2412s """ 2412s > with pandasSQL_builder(con=con) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:2923: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 
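[editor's note] For reference, the pandas.io.sql.get_schema call that test_api_get_schema_keys makes against the PostgreSQL engine can also be exercised without any database server through the sqlite3 fallback mentioned in the docstring above. A minimal sketch (illustrative only; the frame mirrors the one built in the failing test):

import sqlite3
import pandas as pd
from pandas.io import sql

frame = pd.DataFrame({"Col1": [1.1, 1.2], "Col2": [2.1, 2.2]})
with sqlite3.connect(":memory:") as con:
    # keys="Col1" asks for a PRIMARY KEY clause, as in the failing test.
    print(sql.get_schema(frame, "test", con=con, keys="Col1"))

[end of editor's note]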
2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s 
None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_api_get_schema_keys[postgresql_psycopg2_conn] ______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_get_schema_keys(conn, request, test_frame1): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="'get_schema' not implemented for ADBC drivers", 2412s strict=True, 2412s ) 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2185: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
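Editor's note: the comments inside _get_active_fixturedef above explain that an overriding fixture which requests its own name receives the value of the fixture it overrides, one level up. A minimal sketch of that pattern with hypothetical fixture names (not part of the pandas suite):

    # conftest.py
    import pytest

    @pytest.fixture
    def username():
        return "base"

    # test_override.py -- same fixture name, requests itself, so it receives
    # the conftest value while pytest walks one level up the fixturedefs list
    @pytest.fixture
    def username(username):
        return "overridden-" + username

    def test_username(username):
        assert username == "overridden-base"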
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
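Editor's note: the cached-result check above deliberately wraps the == comparison in a try/except because some cache keys (notably numpy arrays, see the referenced pytest issue #6497) raise instead of reducing to a single bool, and identity is then used as the fallback. The same pattern in isolation:

    import numpy as np

    def keys_match(a, b):
        try:
            # normal equality first
            return bool(a == b)
        except (ValueError, RuntimeError):
            # e.g. bool() of an element-wise numpy comparison raises ValueError
            return a is b

    key = np.array([1, 2, 3])
    print(keys_match(key, key))        # True, via the identity fallback
    print(keys_match((1, 0), (1, 0)))  # True, via ==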
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
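Editor's note: call_fixture_func above drives generator (yield) fixtures by calling next() once for setup; the postgresql_psycopg2_conn fixture it is invoking follows that shape. A rough illustration of such a fixture (a sketch, not the exact pandas code):

    import pytest
    import sqlalchemy

    @pytest.fixture
    def postgresql_conn_sketch():
        # setup runs on the first next() call made by call_fixture_func
        engine = sqlalchemy.create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
        )
        with engine.connect() as conn:
            yield conn          # value handed to the requesting test
        # teardown resumes here after the test finishes
        engine.dispose()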
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
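Editor's note: as _handle_dbapi_exception_noconnection shows, SQLAlchemy re-raises the driver's psycopg2.OperationalError wrapped in sqlalchemy.exc.OperationalError (a DBAPIError), chaining the original as the cause. Code that wants to tolerate an unreachable database can therefore catch the wrapped type and still reach the driver error via .orig; a small sketch:

    import sqlalchemy
    from sqlalchemy import text

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except sqlalchemy.exc.OperationalError as exc:
        # exc.orig is the underlying psycopg2.OperationalError
        print("database unreachable:", exc.orig)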
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________________ test_api_chunksize_read[mysql_pymysql_engine] _________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_chunksize_read(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2211: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
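Editor's note: versioned_importorskip, quoted above, is pandas' Debian-patched variant of the usual skip-if-missing idiom. In plain pytest the equivalent looks roughly like the following (the 1.0 minimum is an arbitrary example, not pandas' actual pin):

    import pytest

    # skip the current test if pymysql is absent or older than 1.0
    pymysql = pytest.importorskip("pymysql", minversion="1.0")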
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
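Editor's note: import_optional_dependency is an internal pandas helper (its full docstring appears above; the traceback gives the module path /usr/lib/python3/dist-packages/pandas/compat/_optional.py). Its errors= modes can be exercised directly; a sketch of the documented errors="ignore" behaviour, which returns None instead of raising when the package is missing:

    from pandas.compat._optional import import_optional_dependency

    maybe_pymysql = import_optional_dependency("pymysql", errors="ignore")
    if maybe_pymysql is None:
        print("pymysql not importable here")
    else:
        print("pymysql", maybe_pymysql.__version__)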
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b8830>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b8830> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________________ test_api_chunksize_read[mysql_pymysql_conn] __________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_chunksize_read(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2211: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
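Editor's note: the AttributeError above is raised inside cryptography's own hashes.py when it looks for a hashes submodule on its compiled Rust bindings; on this testbed the binding apparently does not expose it, which is often a sign that the Python half of python3-cryptography and its _rust extension come from different builds. A small diagnostic sketch to see what the installed binding actually exposes:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(cryptography.__version__)
    print(hasattr(rust_openssl, "hashes"))   # False on this testbed
    print([name for name in dir(rust_openssl) if not name.startswith("_")])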
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b8950>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b8950> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____________ test_api_chunksize_read[postgresql_psycopg2_engine] ______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
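[Editorial aside, not part of the captured log] The AttributeError a few lines above is the root cause of the mysql_pymysql_engine fixture error: importing pymysql pulls in cryptography for its auth helpers, and on this testbed the Rust binding module exposes no `hashes` attribute, so the import fails with AttributeError rather than ImportError and is therefore reported as a test error instead of being turned into a skip. A minimal standalone reproduction sketch, assuming the same set of installed packages as on this testbed:

    import importlib

    try:
        # importing pymysql indirectly imports cryptography.hazmat.primitives.hashes
        importlib.import_module("pymysql")
        print("pymysql import OK")
    except ImportError as exc:
        print("missing dependency:", exc)   # an absent module would land here
    except AttributeError as exc:
        print("broken dependency:", exc)    # what this log actually shows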
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_chunksize_read(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2412s ) 2412s conn_name = conn 2412s conn = request.getfixturevalue(conn) 2412s > if sql.has_table("test_chunksize", conn): 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2212: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s table_name = 'test_chunksize' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None 2412s 2412s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2412s """ 2412s Check if DataBase has named table. 2412s 2412s Parameters 2412s ---------- 2412s table_name: string 2412s Name of SQL table. 2412s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s schema : string, default None 2412s Name of SQL schema in database to write to (if database flavor supports 2412s this). If None, use default schema (default). 
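[Editorial aside, not part of the captured log] The "Connection refused" error above points at the test environment rather than pandas itself: nothing is listening on localhost:5432 in the autopkgtest testbed, so fixtures that build a postgresql+psycopg2 engine fail as soon as a real connection is requested. A quick reachability probe of the kind one might run on the testbed (hypothetical helper, not part of the test suite):

    import socket

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        rc = s.connect_ex(("localhost", 5432))
        # 0 would mean a PostgreSQL server is reachable;
        # 111 (ECONNREFUSED on Linux) matches the failures in this log
        print("connect_ex returned", rc)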
2412s 2412s Returns 2412s ------- 2412s boolean 2412s """ 2412s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_api_chunksize_read[postgresql_psycopg2_conn] _______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
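[Editorial aside, not part of the captured log] As the chained traceback above shows, SQLAlchemy wraps the driver-level psycopg2.OperationalError in its own sqlalchemy.exc.OperationalError (the https://sqlalche.me/e/20/e3q8 reference) at the moment Engine.connect() first checks a real DBAPI connection out of the pool. A small illustration of that wrapping, assuming SQLAlchemy and psycopg2 are installed and the same unreachable server as in this run:

    import sqlalchemy as sa

    engine = sa.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        # engine creation is lazy; the DBAPI error only surfaces on connect()
        engine.connect()
    except sa.exc.OperationalError as exc:
        print(type(exc).__name__)        # SQLAlchemy wrapper class
        print(type(exc.orig).__name__)   # underlying psycopg2.OperationalError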
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_chunksize_read(conn, request): 2412s if "adbc" in conn: 2412s request.node.add_marker( 2412s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2412s ) 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2211: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
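[Editorial aside, not part of the captured log] The getfixturevalue docstring above describes the pattern the pandas SQL tests rely on: the connection fixture name arrives as a parametrize value and is resolved at run time with request.getfixturevalue(conn), which is why a broken connection fixture surfaces as an error inside the test body. A minimal self-contained illustration of the same pattern (hypothetical names, not from the pandas suite):

    import pytest

    @pytest.fixture
    def sqlite_conn_name():
        return "sqlite://"

    @pytest.mark.parametrize("conn", ["sqlite_conn_name"])
    def test_dynamic_fixture_lookup(conn, request):
        # resolve the fixture whose name was passed as a parameter
        value = request.getfixturevalue(conn)
        assert value == "sqlite://"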
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s __________________ test_api_categorical[mysql_pymysql_engine] __________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_categorical(conn, request): 2412s if conn == "postgresql_adbc_conn": 2412s adbc = import_optional_dependency("adbc_driver_postgresql", errors="ignore") 2412s if adbc is not None and Version(adbc.__version__) < Version("0.9.0"): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="categorical dtype not implemented for ADBC postgres driver", 2412s strict=True, 2412s ) 2412s ) 2412s # GH8624 2412s # test that categorical gets written correctly as dense column 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2266: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
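Editor's note, a sketch only: the postgresql-backed failures above all reduce to one environment problem — psycopg2 cannot reach a PostgreSQL server on localhost:5432, so every postgresql_psycopg2_* fixture errors out during setup and SQLAlchemy re-wraps the psycopg2.OperationalError as sqlalchemy.exc.OperationalError. The snippet below reproduces that connection attempt outside pytest, using only the DSN values shown in the traceback; everything else is illustrative and not part of the test suite.

import psycopg2

try:
    # Same parameters as the DSN in the log:
    # 'host=localhost dbname=pandas user=postgres password=postgres port=5432'
    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )
    conn.close()
    print("PostgreSQL is up; the postgresql-backed SQL tests could run")
except psycopg2.OperationalError as err:
    # On this testbed no server is listening on port 5432, so this branch
    # is taken, matching the "Connection refused" errors above.
    print(f"PostgreSQL not reachable: {err}")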
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 
2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 
2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9370>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9370> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 
2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________________ test_api_categorical[mysql_pymysql_conn] ___________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_categorical(conn, request): 2412s if conn == "postgresql_adbc_conn": 2412s adbc = import_optional_dependency("adbc_driver_postgresql", errors="ignore") 2412s if adbc is not None and Version(adbc.__version__) < Version("0.9.0"): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="categorical dtype not implemented for ADBC postgres driver", 2412s strict=True, 2412s ) 2412s ) 2412s # GH8624 2412s # test that categorical gets written correctly as dense column 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2266: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 
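Editor's note, a sketch only: the mysql-backed parametrizations fail for a different reason. Importing pymysql pulls in cryptography.hazmat.primitives.hashes, which evaluates Hash = rust_openssl.hashes.Hash and raises AttributeError because the installed compiled _rust bindings do not expose a hashes submodule — a mismatch within the cryptography package on the testbed rather than a pandas or pymysql defect. The snippet below isolates that import failure without pandas, using only module names taken from the traceback; the try/except wrapper is illustrative.

try:
    # pymysql -> pymysql._auth -> cryptography.hazmat.primitives.hashes,
    # which is where the AttributeError in the log originates.
    from cryptography.hazmat.primitives import hashes  # noqa: F401
    import pymysql  # noqa: F401
    print("cryptography and pymysql import cleanly")
except AttributeError as err:
    # Matches the log: module 'cryptography.hazmat.bindings._rust.openssl'
    # has no attribute 'hashes'
    print(f"broken cryptography installation: {err}")

Because mysql_pymysql_conn is built on top of mysql_pymysql_engine and the tests resolve fixtures by name via request.getfixturevalue(conn), the same import error surfaces again for every mysql_pymysql_conn parametrization during fixture setup, as in the traceback that follows.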
2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9550>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9550> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______________ test_api_categorical[postgresql_psycopg2_engine] _______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_categorical(conn, request): 2412s if conn == "postgresql_adbc_conn": 2412s adbc = import_optional_dependency("adbc_driver_postgresql", errors="ignore") 2412s if adbc is not None and Version(adbc.__version__) < Version("0.9.0"): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="categorical dtype not implemented for ADBC postgres driver", 2412s strict=True, 2412s ) 2412s ) 2412s # GH8624 2412s # test that categorical gets written correctly as dense column 2412s conn = request.getfixturevalue(conn) 2412s > if sql.has_table("test_categorical", conn): 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2267: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s table_name = 'test_categorical' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None 2412s 2412s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2412s """ 2412s Check if DataBase has named table. 2412s 2412s Parameters 2412s ---------- 2412s table_name: string 2412s Name of SQL table. 2412s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 
2412s schema : string, default None 2412s Name of SQL schema in database to write to (if database flavor supports 2412s this). If None, use default schema (default). 2412s 2412s Returns 2412s ------- 2412s boolean 2412s """ 2412s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________________ test_api_categorical[postgresql_psycopg2_conn] ________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_categorical(conn, request): 2412s if conn == "postgresql_adbc_conn": 2412s adbc = import_optional_dependency("adbc_driver_postgresql", errors="ignore") 2412s if adbc is not None and Version(adbc.__version__) < Version("0.9.0"): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="categorical dtype not implemented for ADBC postgres driver", 2412s strict=True, 2412s ) 2412s ) 2412s # GH8624 2412s # test that categorical gets written correctly as dense column 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2266: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 
2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s 
None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_api_unicode_column_name[mysql_pymysql_engine] ______________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_unicode_column_name(conn, request): 2412s # GH 11431 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2289: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9eb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9eb0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______________ test_api_unicode_column_name[mysql_pymysql_conn] _______________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_unicode_column_name(conn, request): 2412s # GH 11431 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2289: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
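The pymysql-parametrized failures are a different problem: the test never reaches MySQL at all, because importing pymysql pulls in pymysql._auth, which imports cryptography.hazmat.primitives.hashes, and the installed cryptography Rust bindings expose no `hashes` attribute. A minimal sketch of that import chain (assuming the same mismatched python3-cryptography installation as on this testbed):

    # pymysql -> pymysql.connections -> pymysql._auth -> cryptography.hazmat.primitives.hashes
    # hashes.py then does `Hash = rust_openssl.hashes.Hash`, which raises AttributeError here,
    # so versioned_importorskip sees an unexpected error rather than an ImportError it could
    # turn into a test skip.
    try:
        import pymysql
    except AttributeError as exc:
        print(exc)  # module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'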
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
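The FixtureDef.execute code quoted above is pytest's fixture-value cache: a fixture body runs roughly once per parametrization within its scope, the result (or exception) is stored in cached_result, and finalizers are registered on the fixtures it depends on so teardown happens in the right order. A minimal sketch of that behaviour from the user's side; the fixture and test names are illustrative and not part of the pandas suite:

    import pytest

    CALLS = []

    @pytest.fixture(scope="module")
    def expensive_resource():
        # Runs once for the whole module; later tests receive the cached value.
        CALLS.append("setup")
        yield {"ready": True}

    def test_first(expensive_resource):
        assert expensive_resource["ready"]

    def test_second(expensive_resource):
        # Still a single setup call: the cached result was reused.
        assert CALLS == ["setup"]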
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9fd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4b9fd0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________ test_api_unicode_column_name[postgresql_psycopg2_engine] ___________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
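The AttributeError that ends the traceback above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while pymysql is still being imported: cryptography's hashes.py assigns Hash = rust_openssl.hashes.Hash at module level (line 87 in the frame above), so a python3-cryptography installation whose compiled _rust extension is out of step with its Python layer fails before any MySQL code runs. A minimal sketch that exercises the same module-level lookup; on a healthy installation it simply prints a SHA-256 digest, while on the testbed's installation the import itself would raise:

    from cryptography.hazmat.primitives import hashes  # AttributeError here on the broken install

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas")
    print(digest.finalize().hex())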
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_unicode_column_name(conn, request): 2412s # GH 11431 2412s conn = request.getfixturevalue(conn) 2412s > if sql.has_table("test_unicode", conn): 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2290: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s table_name = 'test_unicode' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None 2412s 2412s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2412s """ 2412s Check if DataBase has named table. 2412s 2412s Parameters 2412s ---------- 2412s table_name: string 2412s Name of SQL table. 2412s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s schema : string, default None 2412s Name of SQL schema in database to write to (if database flavor supports 2412s this). If None, use default schema (default). 
2412s 2412s Returns 2412s ------- 2412s boolean 2412s """ 2412s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_api_unicode_column_name[postgresql_psycopg2_conn] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
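This failure and the postgresql_psycopg2_conn one that follows reduce to the same root cause shown in the psycopg2 frames: the fixtures build an engine for postgresql+psycopg2://postgres:postgres@localhost:5432/pandas, and nothing is listening on port 5432 in the testbed, so the very first connection attempt is refused before any pandas code runs. A sketch of the connection these fixtures effectively make; it only succeeds where a PostgreSQL server with that user, password and database is actually available:

    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    with engine.connect() as conn:  # OperationalError: Connection refused, as in the log
        print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())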
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_unicode_column_name(conn, request): 2412s # GH 11431 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2289: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_api_escaped_table_name[mysql_pymysql_engine] _______________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_escaped_table_name(conn, request): 2412s # GH 13206 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4ba990>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4ba990> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______________ test_api_escaped_table_name[mysql_pymysql_conn] ________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_escaped_table_name(conn, request): 2412s # GH 13206 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4baab0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4baab0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________ test_api_escaped_table_name[postgresql_psycopg2_engine] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_escaped_table_name(conn, request): 2412s # GH 13206 2412s conn_name = conn 2412s conn = request.getfixturevalue(conn) 2412s > if sql.has_table("d1187b08-4943-4c8d-a7f6", conn): 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2303: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s table_name = 'd1187b08-4943-4c8d-a7f6' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None 2412s 2412s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2412s """ 2412s Check if DataBase has named table. 2412s 2412s Parameters 2412s ---------- 2412s table_name: string 2412s Name of SQL table. 2412s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s schema : string, default None 2412s Name of SQL schema in database to write to (if database flavor supports 2412s this). If None, use default schema (default). 
2412s 2412s Returns 2412s ------- 2412s boolean 2412s """ 2412s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_api_escaped_table_name[postgresql_psycopg2_conn] _____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_escaped_table_name(conn, request): 2412s # GH 13206 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
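The postgresql_psycopg2_conn fixture goes through the SQLAlchemy path shown in these frames: Engine.connect() checks a DBAPI connection out of the pool, which ultimately calls psycopg2.connect(). A minimal sketch of that path, assuming the URL from the engine repr with the password taken from the DSN earlier in the traceback (the repr masks it):

from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

# connect() checks a pooled DBAPI connection out; with no server listening it
# raises the same OperationalError recorded in this log before the block runs
with engine.connect() as connection:
    connection.execute(text("SELECT 1"))
    connection.commit()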
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
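The driver-level error is re-raised just below as sqlalchemy.exc.OperationalError, with the original psycopg2 exception kept attached. A sketch of how a caller can tell the two layers apart, assuming the same engine URL as above:

import sqlalchemy
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    engine.connect()
except sqlalchemy.exc.OperationalError as exc:
    # .orig holds the wrapped driver exception (psycopg2.OperationalError here)
    print(type(exc.orig), exc.orig)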
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s __________ test_api_read_sql_duplicate_columns[mysql_pymysql_engine] ___________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_read_sql_duplicate_columns(conn, request): 2412s # GH#53117 2412s if "adbc" in conn: 2412s pa = pytest.importorskip("pyarrow") 2412s if not ( 2412s Version(pa.__version__) >= Version("16.0") 2412s and conn in ["sqlite_adbc_conn", "postgresql_adbc_conn"] 2412s ): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="pyarrow->pandas throws ValueError", strict=True 2412s ) 2412s ) 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2333: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 
2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 
2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4bb4d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4bb4d0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 
2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________ test_api_read_sql_duplicate_columns[mysql_pymysql_conn] ____________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_read_sql_duplicate_columns(conn, request): 2412s # GH#53117 2412s if "adbc" in conn: 2412s pa = pytest.importorskip("pyarrow") 2412s if not ( 2412s Version(pa.__version__) >= Version("16.0") 2412s and conn in ["sqlite_adbc_conn", "postgresql_adbc_conn"] 2412s ): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="pyarrow->pandas throws ValueError", strict=True 2412s ) 2412s ) 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2333: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 
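The MySQL-backed variants never reach a database at all: importing pymysql pulls in cryptography, whose pure-Python layer expects a 'hashes' attribute on the compiled cryptography.hazmat.bindings._rust.openssl extension, and that attribute is missing here (presumably a mismatch between the Python-level cryptography package and its compiled Rust bindings on this i386 testbed). A quick check for that mismatch, using only the module named in the traceback:

from cryptography.hazmat.bindings._rust import openssl as rust_openssl

# On a consistent install this prints True; False reproduces the AttributeError
# raised from cryptography/hazmat/primitives/hashes.py in the frames above.
print(hasattr(rust_openssl, "hashes"))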
2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4bb590>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4bb590> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______ test_api_read_sql_duplicate_columns[postgresql_psycopg2_engine] ________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
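[Aside on the failure just above: the mysql_pymysql_engine fixture never reaches MySQL at all. Importing pymysql pulls in pymysql._auth, which imports cryptography, and the installed cryptography's pure-Python hashes.py expects a `hashes` attribute on its compiled Rust binding (`rust_openssl.hashes.Hash`) that is not there, hence the AttributeError. That shape of error usually points at a mismatch between the python3-cryptography Python files and its _rust extension on this testbed rather than at pandas. It also explains why this shows up as an error instead of a skip: versioned_importorskip/import_optional_dependency deal with a missing module (ImportError), but here the module is present and blows up mid-import. A minimal probe, assuming the same testbed environment as this log, that checks the same attribute outside pandas:

    # Sketch: check whether cryptography's Python layer and Rust extension agree.
    # The import below is the exact one used by cryptography's hashes.py.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl
    import cryptography

    print(cryptography.__version__)
    # hashes.py line 87 evaluates `rust_openssl.hashes.Hash`; on this testbed
    # the attribute is missing, which is the AttributeError in the traceback.
    print(hasattr(rust_openssl, "hashes"))
]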
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_read_sql_duplicate_columns(conn, request): 2412s # GH#53117 2412s if "adbc" in conn: 2412s pa = pytest.importorskip("pyarrow") 2412s if not ( 2412s Version(pa.__version__) >= Version("16.0") 2412s and conn in ["sqlite_adbc_conn", "postgresql_adbc_conn"] 2412s ): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="pyarrow->pandas throws ValueError", strict=True 2412s ) 2412s ) 2412s conn = request.getfixturevalue(conn) 2412s > if sql.has_table("test_table", conn): 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2334: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s table_name = 'test_table' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None 2412s 2412s def has_table(table_name: str, con, schema: str | None = None) -> bool: 2412s """ 2412s Check if DataBase has named table. 2412s 2412s Parameters 2412s ---------- 2412s table_name: string 2412s Name of SQL table. 2412s con: ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s schema : string, default None 2412s Name of SQL schema in database to write to (if database flavor supports 2412s this). If None, use default schema (default). 
2412s 2412s Returns 2412s ------- 2412s boolean 2412s """ 2412s > with pandasSQL_builder(con, schema=schema) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:878: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________ test_api_read_sql_duplicate_columns[postgresql_psycopg2_conn] _________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_api_read_sql_duplicate_columns(conn, request): 2412s # GH#53117 2412s if "adbc" in conn: 2412s pa = pytest.importorskip("pyarrow") 2412s if not ( 2412s Version(pa.__version__) >= Version("16.0") 2412s and conn in ["sqlite_adbc_conn", "postgresql_adbc_conn"] 2412s ): 2412s request.node.add_marker( 2412s pytest.mark.xfail( 2412s reason="pyarrow->pandas throws ValueError", strict=True 2412s ) 2412s ) 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2333: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________________ test_read_table_columns[mysql_pymysql_engine] _________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_columns(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2356: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
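The OperationalError above is simply a refused TCP connection: nothing is listening on localhost:5432 inside this testbed, and the fixture's engine URL (postgresql+psycopg2://postgres:***@localhost:5432/pandas) is visible in the traceback. A minimal sketch, assuming a stock SQLAlchemy 2.x plus psycopg2 install and reusing only the connection parameters shown above, of how the same lazy-connect failure reproduces outside the test suite:

    # Illustrative only; not part of pandas or the autopkgtest machinery.
    from sqlalchemy import create_engine, exc, text

    # Same parameters the postgresql_psycopg2_engine fixture uses, per the log.
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        # The DBAPI connection is created lazily, so the failure only surfaces
        # here, mirroring the fixture's engine.connect() call in test_sql.py.
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except exc.OperationalError as err:
        # With no PostgreSQL server on port 5432, psycopg2's "Connection
        # refused" is wrapped by SQLAlchemy and re-raised as OperationalError.
        print("connection failed:", err.orig)

Running this on a host that does have PostgreSQL listening on 5432 would print nothing and exit cleanly, which is the behaviour the test run assumes.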
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 
2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 
2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4bbe90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4bbe90> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 
2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________________ test_read_table_columns[mysql_pymysql_conn] __________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_columns(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2356: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 
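The AttributeError above means pymysql is installed but its import chain breaks inside cryptography (the i386 _rust.openssl bindings expose no 'hashes' attribute). Per the traceback, pandas' versioned_importorskip funnels into import_optional_dependency, which ultimately calls importlib.import_module; it appears that only a missing module becomes a skip, so an exception raised while the module initialises propagates and the test errors instead of skipping. A standalone sketch of that distinction (illustrative only, not pandas code; probe_optional is a hypothetical name):

    import importlib

    def probe_optional(name: str):
        # Hypothetical helper, loosely mirroring the import step seen in the
        # traceback: a missing module is reported as such, but any other error
        # raised while the module initialises (e.g. the cryptography
        # AttributeError above) is surfaced rather than treated as absent.
        try:
            return importlib.import_module(name)
        except ImportError:
            print(f"{name!r} is not installed")
        except Exception as exc:
            print(f"{name!r} is installed but unusable: {exc!r}")
        return None

    probe_optional("pymysql")

On this testbed the second branch fires, which matches the ERROR (rather than SKIPPED) outcome recorded for the mysql_pymysql_* parametrisations.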
2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
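Based on the import_optional_dependency docstring above, a hedged sketch of the two error modes relevant here (the import path matches the traceback; the usage itself is illustrative):

from pandas.compat._optional import import_optional_dependency

# errors="raise" (default): ImportError if pymysql is missing, and an error
# if it is present but older than pandas' minimum supported version
pymysql = import_optional_dependency("pymysql")

# errors="ignore": return None when the package is missing and leave any
# version checking to the caller
maybe_pymysql = import_optional_dependency("pymysql", errors="ignore")
if maybe_pymysql is None:
    print("pymysql not installed")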
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4482f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4482f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____________ test_read_table_columns[postgresql_psycopg2_engine] ______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
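The AttributeError above (rust_openssl has no attribute 'hashes') suggests the cryptography Python layer and its compiled _rust bindings are out of step on this testbed; for contrast, a minimal sketch of the hashes API that the failing assignment normally exposes:

from cryptography.hazmat.primitives import hashes

digest = hashes.Hash(hashes.SHA256())   # hash context backed by OpenSSL
digest.update(b"pandas autopkgtest")
print(digest.finalize().hex())          # 64 hex characters for SHA-256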
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_columns(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s conn = request.getfixturevalue(conn) 2412s > sql.to_sql(test_frame1, "test_frame", conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2357: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s frame = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s name = 'test_frame' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, if_exists = 'fail', index = True, index_label = None 2412s chunksize = None, dtype = None, method = None, engine = 'auto' 2412s engine_kwargs = {} 2412s 2412s def to_sql( 2412s frame, 2412s name: str, 2412s con, 2412s schema: str | None = None, 2412s if_exists: Literal["fail", "replace", "append"] = "fail", 2412s index: bool = True, 2412s index_label: IndexLabel | None = None, 2412s chunksize: int | None = None, 2412s dtype: DtypeArg | None = None, 2412s method: Literal["multi"] | 
Callable | None = None, 2412s engine: str = "auto", 2412s **engine_kwargs, 2412s ) -> int | None: 2412s """ 2412s Write records stored in a DataFrame to a SQL database. 2412s 2412s Parameters 2412s ---------- 2412s frame : DataFrame, Series 2412s name : str 2412s Name of SQL table. 2412s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s or sqlite3 DBAPI2 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s schema : str, optional 2412s Name of SQL schema in database to write to (if database flavor 2412s supports this). If None, use default schema (default). 2412s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2412s - fail: If table exists, do nothing. 2412s - replace: If table exists, drop it, recreate it, and insert data. 2412s - append: If table exists, insert data. Create if does not exist. 2412s index : bool, default True 2412s Write DataFrame index as a column. 2412s index_label : str or sequence, optional 2412s Column label for index column(s). If None is given (default) and 2412s `index` is True, then the index names are used. 2412s A sequence should be given if the DataFrame uses MultiIndex. 2412s chunksize : int, optional 2412s Specify the number of rows in each batch to be written at a time. 2412s By default, all rows will be written at once. 2412s dtype : dict or scalar, optional 2412s Specifying the datatype for columns. If a dictionary is used, the 2412s keys should be the column names and the values should be the 2412s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2412s scalar is provided, it will be applied to all columns. 2412s method : {None, 'multi', callable}, optional 2412s Controls the SQL insertion clause used: 2412s 2412s - None : Uses standard SQL ``INSERT`` clause (one per row). 2412s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2412s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2412s 2412s Details and a sample callable implementation can be found in the 2412s section :ref:`insert method `. 2412s engine : {'auto', 'sqlalchemy'}, default 'auto' 2412s SQL engine library to use. If 'auto', then the option 2412s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2412s behavior is 'sqlalchemy' 2412s 2412s .. versionadded:: 1.3.0 2412s 2412s **engine_kwargs 2412s Any additional kwargs are passed to the engine. 2412s 2412s Returns 2412s ------- 2412s None or int 2412s Number of rows affected by to_sql. None is returned if the callable 2412s passed into ``method`` does not return an integer number of rows. 2412s 2412s .. versionadded:: 1.4.0 2412s 2412s Notes 2412s ----- 2412s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2412s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2412s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2412s rows as stipulated in the 2412s `sqlite3 `__ or 2412s `SQLAlchemy `__ 2412s """ # noqa: E501 2412s if if_exists not in ("fail", "replace", "append"): 2412s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2412s 2412s if isinstance(frame, Series): 2412s frame = frame.to_frame() 2412s elif not isinstance(frame, DataFrame): 2412s raise NotImplementedError( 2412s "'frame' argument should be either a Series or a DataFrame" 2412s ) 2412s 2412s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = True 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = True 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 
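The to_sql docstring and pandasSQL_builder above describe the call path test_read_table_columns takes; a hedged sketch of the same call against the sqlite3 fallback, which needs no running database server (frame and table names are illustrative):

import sqlite3
import pandas as pd

frame = pd.DataFrame({"A": [1.0, 2.0], "B": [3, 4]})
conn = sqlite3.connect(":memory:")
rows = frame.to_sql("test_frame", conn, index=True, if_exists="fail")
print(rows)                                                    # rows written (2)
print(pd.read_sql_query("SELECT * FROM test_frame", conn).shape)
conn.close()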
2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = 
_raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
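Per the psycopg2.connect docstring above, the DSN-string and keyword forms are equivalent; a small sketch using the same connection details as the test configuration (illustrative, and it only succeeds when a PostgreSQL server is actually listening on port 5432):

import psycopg2

dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
conn = psycopg2.connect(dsn)                                   # DSN-string form
conn.close()

conn = psycopg2.connect(host="localhost", dbname="pandas",     # keyword form
                        user="postgres", password="postgres", port=5432)
conn.close()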
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_read_table_columns[postgresql_psycopg2_conn] _______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_columns(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2356: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _______________ test_read_table_index_col[mysql_pymysql_engine] ________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_index_col(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2372: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 
2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 
2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb448ef0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb448ef0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 
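Editor's note: pandas' versioned_importorskip (quoted earlier in this traceback) is a Debian-specific wrapper around the usual optional-dependency machinery: it calls import_optional_dependency and skips the test when that raises. A rough equivalent using only public pytest API, for illustration; the minimum version is an example, not the pandas pin:

    import pytest

    # Skip the test when pymysql is missing or older than the given version;
    # this approximates what versioned_importorskip does for this fixture.
    pymysql = pytest.importorskip("pymysql", minversion="1.0.0")

    # The pandas-internal path seen in the traceback is roughly:
    #   from pandas.compat._optional import import_optional_dependency
    #   module = import_optional_dependency("pymysql", errors="raise")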
2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ________________ test_read_table_index_col[mysql_pymysql_conn] _________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_index_col(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2372: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 
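Editor's note: the AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") suggests the Python layer and the compiled _rust bindings of python3-cryptography are out of sync in this testbed, so pymysql (and hence every mysql_pymysql_* fixture) fails at import time. Under a consistent installation, the interface whose abstract classes are listed above is used roughly like this (a sketch of the documented cryptography hashes API, not of the failing code path):

    from cryptography.hazmat.primitives import hashes

    # Hash is a HashContext; SHA256 is a HashAlgorithm, as defined above.
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas autopkgtest")
    print(digest.finalize().hex())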
2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb448e30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb448e30> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ____________ test_read_table_index_col[postgresql_psycopg2_engine] _____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
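The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while importing cryptography.hazmat.primitives.hashes, which suggests the pure-Python layer and the compiled Rust bindings of python3-cryptography on this testbed are out of step; because pymysql imports this module, every MySQL-backed test errors during fixture setup rather than being skipped. On an installation where the bindings are intact, the hashing API defined in the quoted module is used roughly like this (a generic sketch, not code from the log):

    # Generic use of the hashing API whose import fails in the traceback above.
    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"example data")
    print(digest.finalize().hex())  # 64-character SHA-256 hex digest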
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_index_col(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s conn = request.getfixturevalue(conn) 2412s > sql.to_sql(test_frame1, "test_frame", conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2373: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s frame = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s name = 'test_frame' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, if_exists = 'fail', index = True, index_label = None 2412s chunksize = None, dtype = None, method = None, engine = 'auto' 2412s engine_kwargs = {} 2412s 2412s def to_sql( 2412s frame, 2412s name: str, 2412s con, 2412s schema: str | None = None, 2412s if_exists: Literal["fail", "replace", "append"] = "fail", 2412s index: bool = True, 2412s index_label: IndexLabel | None = None, 2412s chunksize: int | None = None, 2412s dtype: DtypeArg | None = None, 2412s method: Literal["multi"] | 
Callable | None = None, 2412s engine: str = "auto", 2412s **engine_kwargs, 2412s ) -> int | None: 2412s """ 2412s Write records stored in a DataFrame to a SQL database. 2412s 2412s Parameters 2412s ---------- 2412s frame : DataFrame, Series 2412s name : str 2412s Name of SQL table. 2412s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s or sqlite3 DBAPI2 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s schema : str, optional 2412s Name of SQL schema in database to write to (if database flavor 2412s supports this). If None, use default schema (default). 2412s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2412s - fail: If table exists, do nothing. 2412s - replace: If table exists, drop it, recreate it, and insert data. 2412s - append: If table exists, insert data. Create if does not exist. 2412s index : bool, default True 2412s Write DataFrame index as a column. 2412s index_label : str or sequence, optional 2412s Column label for index column(s). If None is given (default) and 2412s `index` is True, then the index names are used. 2412s A sequence should be given if the DataFrame uses MultiIndex. 2412s chunksize : int, optional 2412s Specify the number of rows in each batch to be written at a time. 2412s By default, all rows will be written at once. 2412s dtype : dict or scalar, optional 2412s Specifying the datatype for columns. If a dictionary is used, the 2412s keys should be the column names and the values should be the 2412s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2412s scalar is provided, it will be applied to all columns. 2412s method : {None, 'multi', callable}, optional 2412s Controls the SQL insertion clause used: 2412s 2412s - None : Uses standard SQL ``INSERT`` clause (one per row). 2412s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2412s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2412s 2412s Details and a sample callable implementation can be found in the 2412s section :ref:`insert method `. 2412s engine : {'auto', 'sqlalchemy'}, default 'auto' 2412s SQL engine library to use. If 'auto', then the option 2412s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2412s behavior is 'sqlalchemy' 2412s 2412s .. versionadded:: 1.3.0 2412s 2412s **engine_kwargs 2412s Any additional kwargs are passed to the engine. 2412s 2412s Returns 2412s ------- 2412s None or int 2412s Number of rows affected by to_sql. None is returned if the callable 2412s passed into ``method`` does not return an integer number of rows. 2412s 2412s .. versionadded:: 1.4.0 2412s 2412s Notes 2412s ----- 2412s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2412s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2412s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2412s rows as stipulated in the 2412s `sqlite3 `__ or 2412s `SQLAlchemy `__ 2412s """ # noqa: E501 2412s if if_exists not in ("fail", "replace", "append"): 2412s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2412s 2412s if isinstance(frame, Series): 2412s frame = frame.to_frame() 2412s elif not isinstance(frame, DataFrame): 2412s raise NotImplementedError( 2412s "'frame' argument should be either a Series or a DataFrame" 2412s ) 2412s 2412s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = True 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = True 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 
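The to_sql docstring quoted above is the entry point for the failing call. A self-contained illustration of the same round trip against an in-memory SQLite connection, which needs none of the external database servers this run cannot reach:

    # to_sql / read_sql round trip via sqlite3; no database server required.
    import sqlite3
    import pandas as pd

    frame = pd.DataFrame({"A": [1.0, 2.0], "B": ["x", "y"]})
    conn = sqlite3.connect(":memory:")
    rows = frame.to_sql("test_frame", conn, index=False)   # returns rows written
    print(rows)                                            # 2
    print(pd.read_sql("SELECT * FROM test_frame", conn))
    conn.close()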
2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = 
_raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
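The SQLAlchemy frames above show why the failure surfaces where it does: create_engine() opens no socket, and the DBAPI connection is only created when a pooled connection is first checked out inside engine.connect(), so the psycopg2 "Connection refused" appears in the pool's __connect(). A generic sketch of that lazy behaviour, using an in-memory SQLite URL as a placeholder:

    # create_engine() is lazy; the real DBAPI connect happens on first checkout.
    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite:///:memory:")   # placeholder URL
    with engine.connect() as conn:                 # first checkout -> DBAPI connect
        print(conn.execute(text("SELECT 1")).scalar())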
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____________ test_read_table_index_col[postgresql_psycopg2_conn] ______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
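The psycopg2.connect docstring quoted above lists the keyword parameters that are folded into the DSN. The parameters in the sketch below mirror the cparams shown in the traceback; on this testbed the call fails because nothing is listening on localhost:5432, so it is only usable against a reachable server:

    # Keyword-argument form of psycopg2.connect, matching the cparams above.
    import psycopg2

    conn = psycopg2.connect(
        dbname="pandas", user="postgres", password="postgres",
        host="localhost", port=5432,
    )
    with conn, conn.cursor() as cur:   # 'with conn' commits on success
        cur.execute("SELECT 1")
        print(cur.fetchone())
    conn.close()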
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_read_table_index_col(conn, request, test_frame1): 2412s # test columns argument in read_table 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2372: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
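The getfixturevalue docstring quoted in this frame is the mechanism the pandas SQL tests rely on: the parametrized value is a fixture name (a string), which is resolved into the real connection object at run time. A small illustrative sketch of that pattern, using a stand-in sqlite fixture rather than any of pandas' real ones:

    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        # Stand-in for the real connection fixtures in test_sql.py.
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn", ["sqlite_conn"])
    def test_roundtrip(conn, request):
        # "conn" arrives as a fixture *name*; turn it into the fixture value,
        # exactly like conn = request.getfixturevalue(conn) in the log above.
        conn = request.getfixturevalue(conn)
        conn.execute("CREATE TABLE t (x INTEGER)")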
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
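The Engine.connect and raw_connection docstrings quoted in the frames above describe the context-manager pattern that the postgresql_psycopg2_conn fixture uses. A self-contained sketch of that pattern against an in-memory SQLite engine (SQLite is chosen only so the example runs without a database server; the failing fixture uses the postgresql+psycopg2 URL instead):

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")  # in-memory, no server required

    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()  # SQLAlchemy 2.0 style explicit commit

    # Leaving the with-block returns the DBAPI connection to the pool,
    # which is exactly the step where the pool above fails to create one.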
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_read_sql_delegate[mysql_pymysql_engine_iris] _______________ 2412s conn = 'mysql_pymysql_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_read_sql_delegate(conn, request): 2412s if conn == "sqlite_buildin_iris": 2412s request.applymarker( 2412s pytest.mark.xfail( 2412s reason="sqlite_buildin connection does not implement read_sql_table" 2412s ) 2412s ) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2397: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
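versioned_importorskip, whose docstring appears just above, wraps pandas' import_optional_dependency so that a missing or too-old driver becomes a pytest skip instead of a hard failure. A hedged sketch of the equivalent check with the public pytest helper (the minimum version shown is illustrative, not taken from this log):

    import pytest

    # Skips the calling test/module when pymysql is missing or too old;
    # note it only catches ImportError, not the AttributeError seen below.
    pymysql = pytest.importorskip("pymysql", minversion="1.0.2")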
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4494f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb4494f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
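install_as_MySQLdb, whose docstring is quoted just above, is PyMySQL's documented compatibility shim for code written against mysqlclient/MySQLdb. On a system where pymysql imports cleanly (unlike this testbed) it works like this:

    import pymysql

    pymysql.install_as_MySQLdb()

    # After the call, "import MySQLdb" resolves to the pymysql module,
    # so legacy MySQLdb/Django code runs on the pure-Python driver.
    import MySQLdb
    assert MySQLdb is pymysql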
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______________ test_read_sql_delegate[mysql_pymysql_conn_iris] ________________ 2412s conn = 'mysql_pymysql_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_read_sql_delegate(conn, request): 2412s if conn == "sqlite_buildin_iris": 2412s request.applymarker( 2412s pytest.mark.xfail( 2412s reason="sqlite_buildin connection does not implement read_sql_table" 2412s ) 2412s ) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2397: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
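Note on the failure shown above: the AttributeError "module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'" is raised while pymysql is still importing, and it typically means the pure-Python layer of python3-cryptography and its compiled Rust bindings come from mismatched builds, so hashes.Hash cannot be bound. The probe below is illustrative only (it is not part of pandas, pymysql, or this test run) and assumes nothing beyond python3-cryptography being importable:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

# hashes.py expects rust_openssl.hashes.Hash; when the compiled bindings do not
# expose that submodule, importing cryptography.hazmat.primitives.hashes fails
# exactly as in the traceback above, which in turn breaks the pymysql import.
print("cryptography (Python layer):", cryptography.__version__)
print("rust bindings expose 'hashes':", hasattr(rust_openssl, "hashes"))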
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 
2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 
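For orientation in the pytest frames above and below: test_read_sql_delegate is parametrized over fixture *names* and resolves them at runtime with request.getfixturevalue(), so any error during fixture setup (a broken driver import, an unreachable server) is reported once per parametrized case. A stripped-down sketch of that pattern follows; the fixture and test names are hypothetical stand-ins, not pandas' own:

import pytest

@pytest.fixture
def sqlite_conn():
    # Hypothetical stand-in for pandas' connection fixtures.
    import sqlite3
    conn = sqlite3.connect(":memory:")
    yield conn
    conn.close()

@pytest.mark.parametrize("conn", ["sqlite_conn"])
def test_roundtrip(conn, request):
    # The parametrized value is a fixture name; setup errors for that fixture
    # surface at this call, as they do in the traceback above.
    conn = request.getfixturevalue(conn)
    assert conn.execute("SELECT 1").fetchone() == (1,)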
2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb449610>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb449610> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________ test_read_sql_delegate[postgresql_psycopg2_engine_iris] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_read_sql_delegate(conn, request): 2412s if conn == "sqlite_buildin_iris": 2412s request.applymarker( 2412s pytest.mark.xfail( 2412s reason="sqlite_buildin connection does not implement read_sql_table" 2412s ) 2412s ) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2397: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
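The psycopg2.OperationalError above ("connection to server at \"localhost\" ... port 5432 failed: Connection refused") indicates that nothing is listening on port 5432 on this testbed; the fixture builds the DSN host=localhost dbname=pandas user=postgres password=postgres port=5432, so every PostgreSQL-backed SQL test errors during setup. An illustrative pre-flight check using the same parameters (this snippet is not part of the test suite):

import psycopg2

try:
    # Same connection parameters as the DSN shown in the traceback above.
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        user="postgres",
        password="postgres",
        dbname="pandas",
    )
    conn.close()
    print("PostgreSQL reachable; the SQL fixtures could connect")
except psycopg2.OperationalError as exc:
    print("PostgreSQL not reachable; PostgreSQL-backed tests will error:", exc)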
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'reque... >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'reque... >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_read_sql_delegate[postgresql_psycopg2_conn_iris] _____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_read_sql_delegate(conn, request): 2412s if conn == "sqlite_buildin_iris": 2412s request.applymarker( 2412s pytest.mark.xfail( 2412s reason="sqlite_buildin connection does not implement read_sql_table" 2412s ) 2412s ) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2397: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________ test_warning_case_insensitive_table_name[mysql_pymysql_engine] ________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_warning_case_insensitive_table_name(conn, request, test_frame1): 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin" or "adbc" in conn_name: 2412s request.applymarker(pytest.mark.xfail(reason="Does not raise warning")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2438: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 
2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 
2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 
2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 
2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44a630>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44a630> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 
2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________ test_warning_case_insensitive_table_name[mysql_pymysql_conn] _________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_warning_case_insensitive_table_name(conn, request, test_frame1): 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin" or "adbc" in conn_name: 2412s request.applymarker(pytest.mark.xfail(reason="Does not raise warning")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2438: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 
2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44a6f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44a6f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____ test_warning_case_insensitive_table_name[postgresql_psycopg2_engine] _____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_warning_case_insensitive_table_name(conn, request, test_frame1): 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin" or "adbc" in conn_name: 2412s request.applymarker(pytest.mark.xfail(reason="Does not raise warning")) 2412s 2412s conn = request.getfixturevalue(conn) 2412s # see gh-7815 2412s with tm.assert_produces_warning( 2412s UserWarning, 2412s match=( 2412s r"The provided table name 'TABLE1' is not found exactly as such in " 2412s r"the database after writing the table, possibly due to case " 2412s r"sensitivity issues. Consider using lower case table names." 
2412s ), 2412s ): 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2448: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s During handling of the above exception, another exception occurred: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_warning_case_insensitive_table_name(conn, request, test_frame1): 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin" or "adbc" in conn_name: 2412s request.applymarker(pytest.mark.xfail(reason="Does not raise warning")) 2412s 2412s conn = request.getfixturevalue(conn) 2412s # see gh-7815 2412s > with tm.assert_produces_warning( 2412s UserWarning, 2412s match=( 2412s r"The provided table name 'TABLE1' is not found exactly as such in " 2412s r"the database after writing the table, possibly due to case " 2412s r"sensitivity issues. Consider using lower case table names." 
2412s ), 2412s ): 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2440: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s typ = 2412s value = OperationalError('(psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection ....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s traceback = 2412s 2412s def __exit__(self, typ, value, traceback): 2412s if typ is None: 2412s try: 2412s next(self.gen) 2412s except StopIteration: 2412s return False 2412s else: 2412s try: 2412s raise RuntimeError("generator didn't stop") 2412s finally: 2412s self.gen.close() 2412s else: 2412s if value is None: 2412s # Need to force instantiation so we can reliably 2412s # tell if we get the same exception back 2412s value = typ() 2412s try: 2412s > self.gen.throw(value) 2412s 2412s /usr/lib/python3.13/contextlib.py:162: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s def _assert_caught_expected_warning( 2412s *, 2412s caught_warnings: Sequence[warnings.WarningMessage], 2412s expected_warning: type[Warning], 2412s match: str | None, 2412s check_stacklevel: bool, 2412s ) -> None: 2412s """Assert that there was the expected warning among the caught warnings.""" 2412s saw_warning = False 2412s matched_message = False 2412s unmatched_messages = [] 2412s 2412s for actual_warning in caught_warnings: 2412s if issubclass(actual_warning.category, expected_warning): 2412s saw_warning = True 2412s 2412s if check_stacklevel: 2412s _assert_raised_with_correct_stacklevel(actual_warning) 2412s 2412s if match is not None: 2412s if re.search(match, str(actual_warning.message)): 2412s matched_message = True 2412s else: 2412s unmatched_messages.append(actual_warning.message) 2412s 2412s if not saw_warning: 2412s > raise AssertionError( 2412s f"Did not see expected warning of class " 2412s f"{repr(expected_warning.__name__)}" 2412s ) 2412s E AssertionError: Did not see expected warning of class 'UserWarning' 2412s 2412s /usr/lib/python3/dist-packages/pandas/_testing/_warnings.py:152: AssertionError 2412s ______ test_warning_case_insensitive_table_name[postgresql_psycopg2_conn] ______ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_warning_case_insensitive_table_name(conn, request, test_frame1): 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin" or "adbc" in conn_name: 2412s request.applymarker(pytest.mark.xfail(reason="Does not raise warning")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2438: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_sqlalchemy_type_mapping[mysql_pymysql_engine] ______________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_sqlalchemy_type_mapping(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2458: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44a9f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44a9f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______________ test_sqlalchemy_type_mapping[mysql_pymysql_conn] _______________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_sqlalchemy_type_mapping(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2458: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44ab10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44ab10> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________ test_sqlalchemy_type_mapping[postgresql_psycopg2_engine] ___________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
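The pymysql failures above all reduce to one root cause: importing pymysql pulls in cryptography via pymysql._auth, and on this testbed cryptography's Rust bindings expose no `hashes` module, so the assignment `Hash = rust_openssl.hashes.Hash` raises AttributeError at import time and the pandas MySQL-backed SQL tests error during collection. A minimal reproduction sketch, assuming the same Debian-packaged cryptography build as on the testbed (this snippet is not part of the original log):

    # Sketch: check whether the cryptography Rust bindings expose `hashes`,
    # then try the same import chain the pandas pymysql tests trigger.
    try:
        from cryptography.hazmat.bindings._rust import openssl as rust_openssl
        print("rust openssl exposes 'hashes':", hasattr(rust_openssl, "hashes"))
        import pymysql  # imports pymysql._auth, which imports cryptography.hazmat.primitives
        print("pymysql imported, version", pymysql.VERSION_STRING)
    except (ImportError, AttributeError) as exc:
        # On this testbed the AttributeError from hashes.py:87 surfaces here.
        print("import chain broken:", exc)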
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_sqlalchemy_type_mapping(conn, request): 2412s conn = request.getfixturevalue(conn) 2412s from sqlalchemy import TIMESTAMP 2412s 2412s # Test Timestamp objects (no datetime64 because of timezone) (GH9085) 2412s df = DataFrame( 2412s {"time": to_datetime(["2014-12-12 01:54", "2014-12-11 02:54"], utc=True)} 2412s ) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2465: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 
2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if 
should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_sqlalchemy_type_mapping[postgresql_psycopg2_conn] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
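The psycopg2 failures, by contrast, are environmental: the fixture builds an Engine for postgresql+psycopg2://postgres:***@localhost:5432/pandas, psycopg2 is refused on both ::1 and 127.0.0.1, and SQLAlchemy re-raises the DBAPI error as sqlalchemy.exc.OperationalError. A hedged sketch of the same connection attempt (the URL below is reconstructed from the DSN shown in the log and is an assumption, not part of the original output):

    # Sketch: reproduce the fixture's connection attempt; "Connection refused"
    # simply means no PostgreSQL server is listening on localhost:5432.
    import sqlalchemy
    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            print("connected:", conn.execute(text("SELECT 1")).scalar())
    except sqlalchemy.exc.OperationalError as exc:
        # Without a running server this reproduces the OperationalError above.
        print("connection failed:", exc.orig)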
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_sqlalchemy_type_mapping(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2458: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[int8-SMALLINT-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b170>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b170> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[int8-SMALLINT-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
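The mysql_pymysql fixtures fail for a different reason than the PostgreSQL ones: importing pymysql pulls in pymysql._auth, which imports cryptography, and on this testbed the pure-Python side of python3-cryptography does not match its compiled Rust bindings, so rust_openssl has no hashes attribute and the import raises AttributeError before versioned_importorskip gets a chance to skip the test. A small sketch of the same attribute access (illustrative only; with a consistent cryptography installation it succeeds):

    # Mirrors the failing line in cryptography/hazmat/primitives/hashes.py.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    try:
        Hash = rust_openssl.hashes.Hash
        print("cryptography bindings look consistent:", Hash)
    except AttributeError as err:
        # This is the mismatch reported in the log above.
        print("mismatched cryptography build:", err)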
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b230>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b230> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
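Note on request.getfixturevalue(), documented in the pytest frame that follows: the pandas test parametrizes over fixture names as strings (such as 'postgresql_psycopg2_conn') and resolves the actual fixture at call time, which is why the connection failure surfaces during fixture setup rather than in the test body. A small self-contained sketch of the same pattern, using hypothetical fixtures unrelated to the database fixtures in this run:

    import pytest

    @pytest.fixture
    def small_data():
        return [1, 2, 3]

    @pytest.fixture
    def large_data():
        return list(range(1000))

    @pytest.mark.parametrize("data_fixture", ["small_data", "large_data"])
    def test_sum_is_positive(data_fixture, request):
        data = request.getfixturevalue(data_fixture)  # run the named fixture dynamically
        assert sum(data) > 0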
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
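The psycopg2.connect() docstring quoted above accepts either a DSN string or keyword arguments. A minimal sketch of the call the fixture effectively makes, reconstructed from the DSN shown earlier in this traceback (host=localhost, dbname=pandas, user/password=postgres, port=5432); this is an illustrative reconstruction, not part of the test suite:

# Sketch only: assumes psycopg2 is installed and uses the credentials from the
# DSN quoted in this log. Without a PostgreSQL server on localhost:5432 it
# reproduces the "Connection refused" failure seen here.
import psycopg2

try:
    conn = psycopg2.connect(
        dbname="pandas", user="postgres", password="postgres",
        host="localhost", port=5432,
    )
    conn.close()
except psycopg2.OperationalError as exc:
    print("could not connect:", exc)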
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[Int8-SMALLINT-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'Int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
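getfixturevalue(), whose docstring appears above, is what test_sqlalchemy_integer_mapping uses to turn the parametrized string 'mysql_pymysql_engine' into the actual fixture value at run time. A small self-contained sketch of the same pattern (the fixture name here is invented for illustration):

import pytest

@pytest.fixture
def fake_engine():
    return "engine-object"

@pytest.mark.parametrize("conn", ["fake_engine"])
def test_dynamic_fixture_lookup(conn, request):
    # Resolve the fixture named by the parametrized string at run time,
    # mirroring what pandas' SQL tests do with their "conn" parameter.
    conn = request.getfixturevalue(conn)
    assert conn == "engine-object"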
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
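The pluggy frames above show how pytest dispatches pytest_fixture_setup: hook callers accept keyword arguments only, and because the spec is marked firstresult=True the first non-None return value is used as the single result. A standalone pluggy sketch of that behaviour (the project name and hook are invented for illustration):

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)
    def setup_value(self, request):
        """Return a value for *request*; the first non-None result wins."""

class PluginA:
    @hookimpl
    def setup_value(self, request):
        return None                     # defers to the next implementation

class PluginB:
    @hookimpl
    def setup_value(self, request):
        return f"value-for-{request}"

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(PluginA())
pm.register(PluginB())
print(pm.hook.setup_value(request="fixture"))   # keyword-only call, single result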
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b710>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b710> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
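install_as_MySQLdb(), quoted above, simply aliases the pymysql module under the name MySQLdb, so code that imports MySQLdb transparently uses PyMySQL. A minimal sketch, assuming a working pymysql installation (which this testbed does not have, as the AttributeError further down shows):

import pymysql

pymysql.install_as_MySQLdb()    # registers sys.modules["MySQLdb"] = pymysql

import MySQLdb                  # now resolves to the pymysql module
print(MySQLdb.__name__)         # -> "pymysql"
print(MySQLdb.get_client_info())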
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[Int8-SMALLINT-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'Int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
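FixtureDef.execute(), quoted here again for the _conn variant, caches one value per parametrization and finalizes the previous value before creating a new one. A tiny self-contained illustration of that caching and teardown behaviour (fixture and test names invented):

import pytest

@pytest.fixture(params=["sqlite", "mysql"])
def backend(request):
    resource = f"connection-to-{request.param}"
    yield resource                      # value cached for this parametrization
    print("tearing down", resource)     # runs when the parametrization changes

def test_backend_one(backend):
    assert backend.startswith("connection-to-")

def test_backend_two(backend):
    assert "-" in backend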
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b7d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44b7d0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
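The AttributeError above is the root cause of the pymysql collection failure in this run: pymysql._auth imports cryptography.hazmat.primitives.hashes, and hashes.py fails at module level because the Rust binding module on this testbed does not expose a `hashes` attribute, which suggests a mismatched python3-cryptography build rather than a pandas problem. A purely diagnostic sketch of the same import chain, run outside pytest under the same installed packages:

    # Diagnostic sketch: reproduce the failing import chain seen in the traceback above.
    try:
        from cryptography.hazmat.primitives import hashes  # raises the AttributeError shown above
        import pymysql.connections                          # never reached while hashes is broken
    except Exception as exc:
        print(f"import chain failed: {exc!r}")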
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
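The psycopg2.connect() docstring quoted in this frame covers both calling styles (DSN string and keyword arguments); a minimal sketch mirroring the DSN shown in this traceback, which only succeeds if a PostgreSQL server is listening on localhost:5432 and therefore fails with "Connection refused" on this testbed (credentials are the illustrative ones from the test setup):

    import psycopg2

    # DSN string form, matching the dsn value reported in the traceback.
    conn = psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")
    conn.close()

    # Equivalent keyword-argument form.
    conn = psycopg2.connect(host="localhost", dbname="pandas", user="postgres",
                            password="postgres", port=5432)
    conn.close()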
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'Int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'Int8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
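The getfixturevalue() docstring above describes dynamic fixture lookup, which is exactly how this parametrized test turns the string 'postgresql_psycopg2_conn' into a live fixture value at run time. A minimal self-contained sketch of the same pattern, with hypothetical fixture and test names:

    import pytest

    @pytest.fixture
    def answer():
        return 42

    @pytest.mark.parametrize("fixture_name", ["answer"])
    def test_dynamic_lookup(fixture_name, request):
        # Resolve a fixture by name at run time, as pandas' test_sql.py does with its conn names.
        value = request.getfixturevalue(fixture_name)
        assert value == 42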
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
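The pool.connect() docstring above notes that the checked-out connection is returned to the pool when its close() method is called. A short sketch of the engine-level pattern that produces these frames, using the same URL as the traceback and therefore only working against a running PostgreSQL server:

    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    with engine.connect() as conn:            # checks a DBAPI connection out of the pool
        conn.execute(text("SELECT 1"))
        conn.commit()
    # Leaving the block returns the connection to the pool rather than closing it outright.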
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[uint8-SMALLINT-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'uint8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44bcb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44bcb0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[uint8-SMALLINT-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'uint8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44bd70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb44bd70> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
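The AttributeError above is the root cause of every pymysql-backed failure in this run: the import chain pymysql -> pymysql._auth -> cryptography.hazmat.primitives.hashes dies because the pure-Python side of python3-cryptography expects a `hashes` attribute on its compiled Rust bindings that the installed extension does not provide, i.e. the package and its `_rust` extension appear to be out of sync on this testbed. A minimal, purely illustrative check (not part of the test suite) that makes the mismatch visible:

    # Illustrative only: compare the Python package with its compiled Rust bindings.
    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print("cryptography version:", cryptography.__version__)
    # On a consistent install this prints True; in the environment logged above it
    # prints False, so `from cryptography.hazmat.primitives import hashes` raises
    # AttributeError at import time, which pymysql's try/except (presumably aimed
    # at ImportError) does not swallow.
    print("rust bindings expose hashes:", hasattr(rust_openssl, "hashes"))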
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'uint8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
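Every postgresql_psycopg2_* parametrization in this group fails identically: psycopg2 cannot reach a PostgreSQL server on localhost:5432, so the error occurs while the engine/connection fixture is being built rather than in the test logic itself. A minimal, illustrative probe using the exact parameters visible in the traceback (host=localhost, dbname=pandas, user=postgres, password=postgres, port=5432) would reproduce the same OperationalError on this testbed:

    # Illustrative only: mirror the connection attempt made by the fixtures.
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", dbname="pandas",
            user="postgres", password="postgres", port=5432,
        )
    except psycopg2.OperationalError as exc:
        # Matches the "Connection refused" errors in the log: no server listening.
        print("postgres unreachable:", exc)
    else:
        conn.close()
        print("postgres reachable")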
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'uint8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
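The *_conn variant of the same test fails one layer earlier: the connection fixture is resolved dynamically with request.getfixturevalue, so the OperationalError surfaces during fixture setup (inside postgresql_psycopg2_conn, which calls postgresql_psycopg2_engine.connect()) before the test body ever runs. A small, self-contained sketch of that pattern, with hypothetical fixture names standing in for the ones defined in pandas/tests/io/test_sql.py:

    # Hypothetical names; only the getfixturevalue pattern is taken from the log.
    import pytest

    @pytest.fixture
    def engine_fixture():
        return "engine"  # stand-in for postgresql_psycopg2_engine

    @pytest.fixture
    def conn_fixture(engine_fixture):
        # Like postgresql_psycopg2_conn: built from the engine fixture, so a
        # connection failure is raised here, during setup, not in the test body.
        yield engine_fixture

    @pytest.mark.parametrize("conn", ["engine_fixture", "conn_fixture"])
    def test_example(conn, request):
        # Resolve the fixture by name at runtime, as test_sqlalchemy_integer_mapping
        # does with request.getfixturevalue(conn).
        conn = request.getfixturevalue(conn)
        assert conn is not None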
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[UInt8-SMALLINT-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'UInt8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac290>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac290> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[UInt8-SMALLINT-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'UInt8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
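The AttributeError above is what actually breaks the MySQL-backed parametrizations: importing pymysql pulls in pymysql._auth, which imports cryptography's hashes module, and hashes.py expects the compiled bindings to expose rust_openssl.hashes. When the pure-Python layer and the compiled _rust extension come from different cryptography builds, that attribute can be missing, so the import fails before the fixture gets a chance to skip. A small diagnostic sketch using only the import path shown in the traceback; it reports the installed version and whether the attribute exists (the exact output on this testbed is not recorded in the log):

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # hashes.py does `Hash = rust_openssl.hashes.Hash`; if the compiled bindings
    # lack a `hashes` submodule, importing pymysql fails with the AttributeError
    # captured above.
    print(cryptography.__version__)
    print(hasattr(rust_openssl, "hashes"))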
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac350>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac350> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
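The AttributeError above shows that the pure-Python layer of python3-cryptography expects its compiled Rust bindings to expose an openssl.hashes module, which the installed binding does not provide, so PyMySQL (and every pandas test that imports it) fails at import time. A minimal sketch, assuming only a system python3 with python3-cryptography installed, to confirm whether the Python layer and the Rust extension agree:

    import cryptography
    from cryptography.hazmat.bindings import _rust

    # Print the installed package version and check whether the compiled
    # Rust bindings expose the attribute that hashes.py dereferences above.
    print("cryptography version:", cryptography.__version__)
    openssl = getattr(_rust, "openssl", None)
    print("_rust.openssl present:", openssl is not None)
    print("_rust.openssl.hashes present:",
          hasattr(openssl, "hashes") if openssl is not None else False)

If the last line prints False while the package version looks current, the Python sources and the compiled extension most likely come from mismatched builds of the package.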
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'UInt8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
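Every postgresql_psycopg2 variant fails the same way: the engine URL postgresql+psycopg2://postgres:***@localhost:5432/pandas points at a PostgreSQL server that is not running on the testbed, so psycopg2 reports "Connection refused" on both ::1 and 127.0.0.1. A minimal stdlib-only sketch (host and port taken from the DSN in the traceback; adjust if the environment differs) to check reachability before suspecting the driver:

    import socket

    # Plain TCP connect to the address psycopg2 is attempting;
    # connect_ex returns 0 on success and an errno otherwise.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(2)
        result = s.connect_ex(("localhost", 5432))

    print("postgres reachable" if result == 0
          else f"tcp connect failed (errno {result})")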
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'UInt8', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
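The connection parameters shown above (host=localhost, dbname=pandas, user=postgres, password=postgres, port=5432) are the ones the pandas SQL fixtures build, and the _connect() call that follows is refused because nothing is listening on port 5432 in this testbed. A minimal sketch of the same attempt outside the test suite, assuming that DSN and no local PostgreSQL server:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        # psycopg2 reports "Connection refused"; SQLAlchemy wraps it into the
        # OperationalError seen in this log, before any pandas code runs.
        print(exc)

The failure happens during pool checkout, so each test that requests a postgresql fixture in this run reports the same error.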
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[int16-SMALLINT-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
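As the getfixturevalue() docstring above notes, a fixture can be resolved by name at run time. The failing test uses exactly that pattern: the parametrized conn argument is only a string, and request.getfixturevalue(conn) turns it into an engine or connection while the test runs. A small illustrative sketch of the same pattern; the sqlite_engine fixture below is a stand-in, not one of the fixtures in test_sql.py:

    import pytest

    @pytest.fixture
    def sqlite_engine():
        sqlalchemy = pytest.importorskip("sqlalchemy")
        # in-memory SQLite needs no server, unlike the MySQL/PostgreSQL fixtures in this log
        return sqlalchemy.create_engine("sqlite://")

    @pytest.mark.parametrize("conn", ["sqlite_engine"])
    def test_integer_mapping_pattern(conn, request):
        engine = request.getfixturevalue(conn)  # resolved dynamically, as in test_sql.py
        assert engine.dialect.name == "sqlite"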
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
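The ??? frames above are importlib's frozen bootstrap; the exception that emerges from them further down is an AttributeError raised while pymysql initialises, not an ImportError. importorskip-style helpers normally convert only ImportError into a skip, which is why this parametrization is reported as a failure instead of being skipped. A hypothetical sketch of that distinction (import_or_skip is an illustrative name, not pandas code):

    import importlib
    import pytest

    def import_or_skip(name: str):
        try:
            return importlib.import_module(name)
        except ImportError:
            # a genuinely missing dependency becomes a skip
            pytest.skip(f"optional dependency {name!r} not installed")
        # any other exception raised while the module initialises propagates,
        # so import_or_skip("pymysql") on this testbed re-raises the AttributeError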
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac830>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac830> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
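pymysql imports cryptography for its authentication helpers, and the chain above fails inside cryptography itself: further down, the pure-Python layer asks the compiled Rust extension for an openssl.hashes attribute that this build does not provide, which usually means the Python files and the _rust extension come from different builds of the package. A small probe, as a sketch (the expected result is inferred from the AttributeError below, not taken from a separate run):

    # Check whether the cryptography Python layer and its Rust bindings agree.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(hasattr(rust_openssl, "hashes"))  # False here, judging by the AttributeError below

    # On a consistent installation this import succeeds; on this testbed it fails
    # the same way the fixture setup does.
    import pymysql  # noqa: F401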
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[int16-SMALLINT-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
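The cached_result handling above means that once a fixture raises during setup, pytest stores the exception and re-raises it for any later request within the fixture's scope rather than running the fixture again. A minimal illustrative sketch (fixture and test names are made up):

    import pytest

    @pytest.fixture(scope="module")
    def broken():
        raise RuntimeError("setup failed once")

    def test_first(broken):
        pass  # errors: the fixture raises during setup

    def test_second(broken):
        pass  # errors too: the same RuntimeError is re-raised from cached_result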
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac8f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ac8f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
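The AttributeError reported above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') originates in the cryptography package itself, not in pandas or pymysql: hashes.py expects the compiled Rust bindings to expose a hashes submodule, and on this testbed they apparently do not, so every import chain that reaches cryptography (pymysql -> _auth -> serialization -> hashes) fails the same way. A minimal, hypothetical check that isolates the mismatch without involving pandas or pymysql:

    # Sketch only: probes whether the cryptography Python front end and its
    # compiled _rust bindings agree on the interface the traceback above relies on.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(hasattr(rust_openssl, "hashes"))  # False would reproduce the failure in this log

    # On a consistent install the following runs and prints a SHA-256 digest;
    # on the broken testbed the import itself raises the AttributeError seen above.
    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"autopkgtest")
    print(digest.finalize().hex())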
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
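Both OperationalError variants above (the raw psycopg2 one and the sqlalchemy.exc wrapper pointing at https://sqlalche.me/e/20/e3q8) say the same thing: nothing is listening on localhost:5432 in this testbed, so every postgresql_psycopg2_* parametrization fails at connect time rather than in pandas code. A hedged sketch of how such tests could be guarded, not how the pandas suite actually does it; the helper name and the skipif marker are illustrative:

    # Sketch: probe the PostgreSQL endpoint from the dsn shown in the traceback
    # (host=localhost port=5432) and skip the SQL tests when it is unreachable.
    import socket

    import pytest

    def postgres_reachable(host="localhost", port=5432, timeout=1.0):
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not postgres_reachable(), reason="no PostgreSQL server on localhost:5432"
    )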
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
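The Engine.connect() and Pool.connect() docstrings repeated through these frames describe the context-manager pattern the pandas fixtures rely on: connect, execute, commit, and return the DBAPI connection to the pool on exit. A hedged sketch of that pattern against an in-memory SQLite URL, chosen only because this testbed has no PostgreSQL server to connect to:

    # Sketch of the engine.connect() usage shown in the docstrings above,
    # using sqlite:// so it runs without the unreachable localhost:5432 server.
    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")
    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (a INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()
        print(connection.execute(text("SELECT a FROM t")).scalar())  # -> 1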
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
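The psycopg2.connect() docstring above, together with the cparams captured in this frame, shows exactly what the pandas fixture attempts. A standalone sketch of the same call with the error handled rather than propagated (on this testbed nothing listens on localhost:5432, so the except branch is taken, matching the OperationalError reported just below):

import psycopg2

try:
    conn = psycopg2.connect(
        dbname="pandas",
        user="postgres",
        password="postgres",
        host="localhost",
        port=5432,
    )
except psycopg2.OperationalError as exc:
    # Raised when no PostgreSQL server accepts TCP/IP connections on that host/port.
    print(f"PostgreSQL not reachable: {exc}")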
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[Int16-SMALLINT-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'Int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
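The getfixturevalue() docstring above explains the dynamic fixture lookup that test_sqlalchemy_integer_mapping relies on: the connection fixture's name arrives as a parametrize value and is resolved at run time. A self-contained sketch of that pattern with a hypothetical fake_engine fixture (not part of the pandas test suite):

import pytest

@pytest.fixture
def fake_engine():
    # stand-in for a database-engine fixture such as mysql_pymysql_engine
    return "engine"

@pytest.mark.parametrize("conn", ["fake_engine"])
def test_uses_named_fixture(conn, request):
    conn = request.getfixturevalue(conn)   # resolve the fixture named by the parameter
    assert conn == "engine"

If the named fixture's setup raises, as mysql_pymysql_engine does here, the error surfaces at the getfixturevalue() call, which is why every parametrized variant fails at test_sql.py:2494.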
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
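The pluggy frames above show how pytest dispatches pytest_fixture_setup: hook callers accept keyword arguments only, and because the spec is firstresult=True the first non-None implementation result is returned. A minimal sketch of that mechanism for a hypothetical "demo" project (ordinary pluggy usage, not pytest's own configuration):

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")

class Spec:
    @hookspec(firstresult=True)        # first non-None result wins, as with pytest_fixture_setup
    def compute(self, value):
        """Return a transformed value."""

class Plugin:
    @hookimpl
    def compute(self, value):
        return value * 2

pm = pluggy.PluginManager("demo")
pm.add_hookspecs(Spec)
pm.register(Plugin())
print(pm.hook.compute(value=21))       # hooks are called with keyword arguments only -> 42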
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
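import_optional_dependency(), whose docstring begins here and whose full parameter documentation continues below, is what versioned_importorskip delegates to. A small sketch of its errors= modes, using the private pandas import path shown in the traceback; note that on this testbed the pymysql import itself raises an AttributeError (shown further below) rather than an ImportError, so the helper propagates it instead of returning None or skipping:

from pandas.compat._optional import import_optional_dependency

# errors="ignore": return None if the module is not installed at all, otherwise
# return the module even when it is older than pandas' minimum supported version.
mod = import_optional_dependency("pymysql", errors="ignore")
print(mod)

# errors="raise" (the default used by versioned_importorskip): a missing module
# raises ImportError with the "Missing optional dependency ..." message built in
# the function body below.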
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6acdd0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6acdd0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
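install_as_MySQLdb(), whose docstring ends just above and whose one-line implementation follows, simply registers the pymysql module under the name MySQLdb in sys.modules. A sketch of the effect, assuming a working pymysql/cryptography install (which this testbed does not have, as the AttributeError below shows):

import pymysql

pymysql.install_as_MySQLdb()   # sys.modules["MySQLdb"] = sys.modules["pymysql"]
import MySQLdb                 # now resolves to the pymysql package

assert MySQLdb is pymysql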
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[Int16-SMALLINT-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'Int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
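The AttributeError reported above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") is the root cause of the pymysql-based failures shown here: cryptography's pure-Python hashes.py expects the compiled Rust extension to expose an openssl.hashes submodule, and on this testbed it does not, which typically indicates the Python sources and the compiled _rust extension come from mismatched cryptography builds. A quick probe for the same condition, using only the import already shown in the traceback:

from cryptography.hazmat.bindings._rust import openssl as rust_openssl

# True on a consistent cryptography install; False here, which is why
# "from cryptography.hazmat.primitives import hashes" blows up during the pymysql import.
print(hasattr(rust_openssl, "hashes"))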
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ace90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ace90> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
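The AttributeError above is the root cause of the pymysql-based failures in this run: the pure-Python layer of python3-cryptography expects a hashes submodule on its Rust binding (cryptography.hazmat.bindings._rust.openssl) that this build does not provide, which usually indicates mismatched Python and Rust halves of the package. On a consistent installation, the HashContext interface quoted above is used roughly like this:

    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())   # Hash implements the HashContext ABC quoted above
    digest.update(b"hello ")
    digest.update(b"world")
    print(digest.finalize().hex())          # finalize() returns the digest as bytes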
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
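The frames above are SQLAlchemy's normal pool checkout path: Engine.raw_connection() asks the QueuePool for a connection, and the pool creates a fresh _ConnectionRecord on demand. A hedged sketch of driving that path from user code (the DSN mirrors the one in the log with a placeholder password, and assumes a reachable server):

    from sqlalchemy import create_engine

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    raw = engine.raw_connection()      # checks a DBAPI connection out of the pool
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchone())
    finally:
        raw.close()                    # returns the connection to the pool rather than closing it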
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'Int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
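The psycopg2.connect docstring above lists the basic parameters; the "Connection refused" error simply means no PostgreSQL server is listening on localhost:5432 inside this testbed, so every PostgreSQL-backed test errors out the same way. A minimal direct-connection sketch with the same parameters, assuming a running server:

    import psycopg2

    conn = psycopg2.connect(host="localhost", port=5432, dbname="pandas",
                            user="postgres", password="postgres")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT version()")
            print(cur.fetchone()[0])
    finally:
        conn.close()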
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
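The parametrized test above (GH35076) checks that pandas maps each of its integer dtypes to the smallest suitable SQLAlchemy column type, e.g. Int16 to SMALLINT. A rough illustration of the same mapping through the public to_sql API, using an in-memory SQLite engine so it does not depend on the PostgreSQL server that is missing here (table and column names are arbitrary):

    import pandas as pd
    from sqlalchemy import create_engine, inspect

    engine = create_engine("sqlite:///:memory:")
    df = pd.DataFrame({"a": pd.array([0, 1], dtype="Int16")})
    df.to_sql("int16_demo", engine, index=False)
    # the column should have been created as SMALLINT, per the mapping under test
    print(inspect(engine).get_columns("int16_demo")[0]["type"])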
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
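The dispatch loop above is where SQLAlchemy offers the failure to any registered handle_error listeners before wrapping and re-raising it. A hedged sketch of hooking that event from application code (the DSN again mirrors the log):

    from sqlalchemy import create_engine, event

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

    @event.listens_for(engine, "handle_error")
    def log_connect_failure(context):
        # context.original_exception is the DBAPI error being wrapped above
        print("connection attempt failed:", context.original_exception)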
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
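As shown above, SQLAlchemy re-raises the driver failure as sqlalchemy.exc.OperationalError (error code e3q8) while keeping the original psycopg2 exception on its .orig attribute. A short sketch of catching the wrapped error:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        # exc.orig is the underlying psycopg2.OperationalError ("Connection refused" in this run)
        print(type(exc.orig).__name__, exc.orig)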
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'Int16', expected = 'SMALLINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
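request.getfixturevalue, whose docstring appears above, is how the pandas test resolves the connection fixture named by its conn parameter at run time. A small self-contained sketch of the same mechanism (fixture and test names are illustrative only):

    import pytest

    @pytest.fixture
    def sample_frame():
        return {"a": [0, 1]}

    def test_dynamic_fixture_lookup(request):
        # look the fixture up by name at run time instead of declaring it as an argument
        value = request.getfixturevalue("sample_frame")
        assert value["a"] == [0, 1]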
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
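FixtureDef.execute above is pytest's caching layer: once a fixture has been set up for a test, later lookups return the cached value (or re-raise the cached exception). A tiny sketch of that caching behaviour:

    import pytest

    calls = {"n": 0}

    @pytest.fixture
    def counted():
        calls["n"] += 1
        return calls["n"]

    def test_fixture_cached_within_test(counted, request):
        # the second lookup hits the cache instead of running the fixture again
        assert request.getfixturevalue("counted") == counted
        assert calls["n"] == 1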
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
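The pluggy frames above show how pytest dispatches pytest_fixture_setup through its hook machinery. A minimal, self-contained pluggy example of the same call pattern (project and hook names here are made up for illustration):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def compute(self, value):
            """Return a result for the given value."""

    class Impl:
        @hookimpl
        def compute(self, value):
            return value * 2

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Impl())
    print(pm.hook.compute(value=21))   # keyword-only call; firstresult=True returns 42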
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
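The postgresql_psycopg2_conn fixture shown above is a generator fixture: it yields a live connection obtained from the engine fixture, and pytest runs the code after the yield (here, leaving the with-block) as teardown. The same pattern against an in-memory SQLite engine, so it runs without the PostgreSQL server that is unavailable in this testbed (names are illustrative):

    import pytest
    from sqlalchemy import create_engine

    @pytest.fixture
    def sqlite_engine():
        engine = create_engine("sqlite:///:memory:")
        yield engine
        engine.dispose()          # teardown after the dependent tests finish

    @pytest.fixture
    def sqlite_conn(sqlite_engine):
        # mirrors postgresql_psycopg2_conn: open a connection, yield it to the test,
        # and let the with-block close it during teardown
        with sqlite_engine.connect() as conn:
            yield conn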
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[uint16-INTEGER-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'uint16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad370>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad370> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[uint16-INTEGER-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'uint16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
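The mysql_pymysql_* failures above have a different cause: importing pymysql pulls in pymysql._auth, which imports cryptography.hazmat.primitives.hashes, and that module fails at "Hash = rust_openssl.hashes.Hash" because cryptography.hazmat.bindings._rust.openssl has no "hashes" attribute. This suggests the pure-Python layer of python3-cryptography and its compiled _rust extension are out of step in this testbed. Note that pymysql._auth guards its cryptography import with a try block, and pandas' versioned_importorskip is meant to skip tests for missing optional dependencies, but both evidently expect an ImportError; what is raised here is an AttributeError, so it propagates and the tests fail during fixture setup instead of being skipped. A hedged, self-contained reproduction, using only modules already named in the traceback, could be:

import traceback

try:
    # The first import is exactly where the log above fails.
    from cryptography.hazmat.primitives import hashes
    import pymysql  # would otherwise fail the same way via pymysql._auth
except Exception:
    # In this environment: AttributeError: module
    # 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'
    traceback.print_exc()
else:
    print("cryptography OK:", hashes.SHA256().name,
          "| pymysql:", pymysql.VERSION_STRING)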
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad430>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad430> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'uint16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
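[editor's note] The DSN in these frames is host=localhost dbname=pandas user=postgres password=postgres port=5432 and no server is running, so the refusal can be reproduced outside the test suite. A sketch assuming only that psycopg2 is installed and nothing listens on localhost:5432:

    # Direct reproduction of the connection attempt shown above, using the
    # keyword form of psycopg2.connect() described in its docstring. With no
    # PostgreSQL server on localhost:5432 this raises OperationalError.
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost",
            dbname="pandas",
            user="postgres",
            password="postgres",
            port=5432,
        )
    except psycopg2.OperationalError as exc:
        print("connection failed:", exc)
    else:
        conn.close()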
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'uint16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
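The psycopg2.connect() docstring above lists the connection parameters the failing engine passes along (host=localhost, dbname=pandas, user=postgres, port=5432). A minimal sketch of the same call and of the failure mode reported just below, assuming no PostgreSQL server is listening on localhost:5432:

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432,
            dbname="pandas", user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as err:
        # With no server listening, this raises the same
        # "Connection refused ... Is the server running" error seen in this log.
        print(err)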
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[UInt16-INTEGER-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'UInt16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
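The parametrisation above (GH35076) spells out which SQL integer type pandas is expected to pick for each integer dtype: the smallest of SMALLINT/INTEGER/BIGINT whose signed range still covers the dtype. A rough sketch of that rule using only numpy dtype metadata (the nullable pandas dtypes such as UInt16 follow the same mapping; this is not the pandas internal code the test actually exercises):

    import numpy as np

    def expected_sql_integer(dtype) -> str:
        # Smallest signed SQL integer type whose range covers the numpy dtype.
        info = np.iinfo(np.dtype(dtype))
        if info.min >= -2**15 and info.max < 2**15:
            return "SMALLINT"
        if info.min >= -2**31 and info.max < 2**31:
            return "INTEGER"
        return "BIGINT"

    assert expected_sql_integer("uint16") == "INTEGER"   # the case parametrised here
    assert expected_sql_integer("int16") == "SMALLINT"
    assert expected_sql_integer("uint32") == "BIGINT"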
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad910>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad910> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
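install_as_MySQLdb(), whose source is quoted above, is PyMySQL's documented MySQLdb compatibility shim. A one-line usage sketch, assuming a working pymysql installation (which the cryptography failure reported further down prevents on this testbed):

    import pymysql

    pymysql.install_as_MySQLdb()   # registers pymysql under the name "MySQLdb"
    import MySQLdb                 # now resolves to the pymysql module
    assert MySQLdb is pymysql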
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[UInt16-INTEGER-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'UInt16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
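The AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") is what makes the pymysql import, and with it both mysql_pymysql_* fixtures, fail. It usually means the pure-Python layer of python3-cryptography and its compiled _rust extension do not come from the same version. A small check that could be run on the testbed to confirm such a mismatch (module names taken from the traceback above):

    import cryptography
    from cryptography.hazmat.bindings import _rust

    print("cryptography version:", cryptography.__version__)
    print("_rust binding loaded from:", _rust.__file__)
    # On a consistent install the next line prints True; here it would print False.
    print("has openssl.hashes:", hasattr(_rust.openssl, "hashes"))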
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
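request.getfixturevalue(), whose docstring appears in these tracebacks, is the dynamic fixture lookup that lets test_sqlalchemy_integer_mapping turn the parametrised connection name into an actual engine or connection. A toy illustration of the same pattern, with hypothetical fixture names rather than the pandas ones:

    import pytest

    @pytest.fixture
    def fast_backend():
        return "sqlite://"

    @pytest.fixture
    def slow_backend():
        return "postgresql://example"

    @pytest.mark.parametrize("backend", ["fast_backend", "slow_backend"])
    def test_backend_url(backend, request):
        # The parameter is only the fixture *name*; resolve it at run time.
        url = request.getfixturevalue(backend)
        assert "://" in url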
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad9d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ad9d0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'UInt16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'UInt16', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[int32-INTEGER-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6adeb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6adeb0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[int32-INTEGER-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6adf70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6adf70> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
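The AttributeError above (rust_openssl has no attribute 'hashes') is what breaks the pymysql import chain in this run: cryptography's Python-level hashes module binds Hash from the compiled Rust extension at import time. A minimal check reproducing the failing attribute access, assuming the same cryptography build as on this testbed:

from cryptography.hazmat.bindings._rust import openssl as rust_openssl

# cryptography/hazmat/primitives/hashes.py does "Hash = rust_openssl.hashes.Hash"
# at import time; on this testbed the attribute is missing, so that import fails.
print(hasattr(rust_openssl, "hashes"))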
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
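Both the postgresql_psycopg2_engine and postgresql_psycopg2_conn parametrizations fail for the same environmental reason: nothing is listening on localhost:5432 in the testbed, so psycopg2.connect() is refused before any pandas code runs. A small availability probe mirroring the failing call (DSN copied from the log; the helper name is illustrative, not part of the test suite):

import psycopg2

def postgres_available(dsn="host=localhost dbname=pandas user=postgres password=postgres port=5432"):
    # Attempt the same connection the fixture makes; treat "Connection refused" as unavailable.
    try:
        psycopg2.connect(dsn).close()
        return True
    except psycopg2.OperationalError:
        return False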
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
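The pytest frames above show the dynamic fixture lookup that test_sqlalchemy_integer_mapping relies on: the parametrized string is resolved to a fixture at runtime via request.getfixturevalue(). A self-contained sketch of that pattern (fixture and test names here are illustrative, not from the pandas suite):

import pytest

@pytest.fixture
def dummy_conn():
    return "connected"

@pytest.mark.parametrize("conn", ["dummy_conn"])
def test_dynamic_fixture_lookup(conn, request):
    conn = request.getfixturevalue(conn)  # resolve the fixture named by the string parameter
    assert conn == "connected"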
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
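[editor's note] The psycopg2.connect() docstring quoted above accepts either keyword parameters or a DSN string. An illustrative sketch using the same parameters shown in the cparams dict of the failing frame (not part of the suite):

    import psycopg2

    # Keyword form, mirroring the cparams dict above.
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )

    # Equivalent DSN-string form, as the docstring describes:
    #   psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")

    with conn.cursor() as cur:
        cur.execute("SELECT version()")
        print(cur.fetchone()[0])
    conn.close()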
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[Int32-INTEGER-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'Int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
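[editor's note] The parametrized test above (GH35076) checks that pandas maps its integer dtypes to the SQLAlchemy types listed in the parametrize table. A hedged illustration of the ("Int32", "INTEGER") case against an in-memory SQLite engine, which needs no server (the real test runs against the MySQL/PostgreSQL fixtures that fail here):

    import pandas as pd
    import sqlalchemy as sa

    engine = sa.create_engine("sqlite:///:memory:")

    # Nullable Int32 should be written through SQLAlchemy's Integer type,
    # matching the ("Int32", "INTEGER") row of the table above.
    df = pd.DataFrame({"a": pd.array([1, 2, None], dtype="Int32")})
    df.to_sql("ints", engine, index=False)

    for col in sa.inspect(engine).get_columns("ints"):
        print(col["name"], col["type"])   # a INTEGER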
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
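[editor's note] As the getfixturevalue()/_get_active_fixturedef frames above show, test_sql.py receives the connection fixture's name from parametrize and resolves it at run time. A minimal self-contained sketch of that pattern (fixture name and value are hypothetical, not the suite's):

    import pytest

    @pytest.fixture
    def sqlite_str():
        # Hypothetical stand-in for fixtures like mysql_pymysql_engine.
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_dynamic_fixture_lookup(conn, request):
        # `conn` arrives as a fixture *name*; resolve it like test_sql.py does.
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")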
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
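[editor's note] The _HookCaller.__call__/_hookexec frames above belong to pluggy, the plugin framework pytest is built on: hooks take keyword arguments only, and firstresult=True returns the first non-None implementation result. A tiny stand-alone sketch of that mechanism (the project name and hook are invented for illustration):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_value(self, key): ...

    class Plugin:
        @hookimpl
        def setup_value(self, key):
            return f"value-for-{key}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    print(pm.hook.setup_value(key="db"))   # -> 'value-for-db'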
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ae450>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ae450> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
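[editor's note] A short sketch of the mysqlclient-compatibility shim whose source is quoted above, assuming a pymysql installation whose import chain actually works (on this testbed it does not; see the cryptography error below):

    import pymysql

    pymysql.install_as_MySQLdb()     # aliases sys.modules["MySQLdb"] to pymysql
    import MySQLdb                   # actually pymysql after the alias

    print(MySQLdb.__name__)           # -> 'pymysql'
    print(pymysql.get_client_info())  # mysqlclient-compatible version string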
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
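[editor's note] pymysql._auth only needs cryptography's serialization and hashes primitives. Under a self-consistent cryptography install that usage is straightforward; a hedged sketch follows. The AttributeError just below indicates that on this image the pure-Python layer and the _rust bindings are out of step, so even importing the module fails:

    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas autopkgtest")
    print(digest.finalize().hex())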
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[Int32-INTEGER-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'Int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
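[editor's note] The pytest_fixture_setup hook shown above hands off to call_fixture_func (its code follows below), which drives generator ("yield") fixtures such as mysql_pymysql_engine via next(generator) and runs the code after the yield at teardown. A minimal sketch of such a fixture against SQLite (illustrative, not the suite's fixture):

    import pytest
    import sqlalchemy as sa

    @pytest.fixture
    def sqlite_engine():
        engine = sa.create_engine("sqlite:///:memory:")
        yield engine            # value handed to the test
        engine.dispose()        # teardown, driven by the fixture finalizer

    def test_select_one(sqlite_engine):
        with sqlite_engine.connect() as conn:
            assert conn.execute(sa.text("SELECT 1")).scalar() == 1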
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ae510>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ae510> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'Int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'Int32', expected = 'INTEGER' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[uint32-BIGINT-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'uint32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
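The sqlalchemy.exc.OperationalError above ("Connection refused" on localhost:5432, SQLAlchemy error code e3q8) is raised before any SQL executes: psycopg2 cannot open a TCP connection because nothing is listening on that port in the testbed, so the postgresql-backed tests in this run fail as soon as they try to connect. A minimal sketch of the same connection attempt, with parameters mirroring the cparams printed in the traceback (they are not special; any unreachable host/port fails the same way):

    # Sketch only: the driver-level connection attempt behind the failing tests.
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432,
            dbname="pandas", user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as err:
        print("no PostgreSQL server reachable:", err)

SQLAlchemy wraps the same driver error in sqlalchemy.exc.OperationalError, which is why the report links to https://sqlalche.me/e/20/e3q8.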
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ae9f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6ae9f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[uint32-BIGINT-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'uint32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
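The AttributeError above is raised inside cryptography's own hashes.py: it expects the compiled Rust extension (cryptography.hazmat.bindings._rust.openssl) to expose a hashes object, and here it does not. Because pymysql imports cryptography for its authentication helpers, the import chain pymysql -> pymysql.connections -> pymysql._auth -> cryptography fails, and every mysql_pymysql_* fixture errors out before any MySQL connection is attempted. A mismatch between the pure-Python cryptography package and its compiled _rust bindings (for example, a stale or wrong-architecture extension) is the usual cause; a hedged diagnostic sketch along those lines:

    # Sketch only: checks whether the installed cryptography package and its
    # compiled Rust bindings agree; the last line printing False reproduces the
    # failure mode seen in the traceback above.
    from importlib import metadata

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print("cryptography version     :", metadata.version("cryptography"))
    print("python package path      :", cryptography.__file__)
    print("rust openssl has 'hashes':", hasattr(rust_openssl, "hashes"))

On a healthy install the last line prints True; False means the extension module does not match the Python-level package it was installed alongside.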
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
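The FixtureDef.execute() logic quoted above caches a fixture's result per cache key, and that cache holds either the value or the exception the fixture raised; a later request with the same key gets the cached exception re-raised via exc.with_traceback(exc_tb) instead of re-running the fixture body. A toy sketch of that exception-caching path, with made-up fixture and test names and a module-scoped fixture so the cache key stays constant across the parametrized tests:

    # Sketch only: pytest re-raises a cached fixture exception rather than
    # re-running the fixture when the cache key is unchanged.
    # Save as test_cached_fixture.py and run with pytest; all names are made up.
    import pytest

    CALLS = {"n": 0}

    @pytest.fixture(scope="module")
    def broken_engine():
        CALLS["n"] += 1                  # executes once; the exception is cached
        raise RuntimeError("setup failed")

    @pytest.mark.parametrize("i", [1, 2, 3])
    def test_uses_engine(broken_engine, i):
        pass                             # never reached; reported as a setup error

    def test_fixture_ran_once():
        assert CALLS["n"] == 1           # tests 2 and 3 got the cached exception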
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6aeab0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6aeab0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'uint32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'uint32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[UInt32-BIGINT-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'UInt32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6aef90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6aef90> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_sqlalchemy_integer_mapping[UInt32-BIGINT-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'UInt32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6af050>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6af050> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __ test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'UInt32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___ test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'UInt32', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______ test_sqlalchemy_integer_mapping[int64-BIGINT-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6af530>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6af530> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______ test_sqlalchemy_integer_mapping[int64-BIGINT-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6af5f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6af5f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___ test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____ test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
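A minimal sketch of the connect() call whose docstring is quoted above — not taken from this run — showing how the "Connection refused" failure a few frames below arises when no PostgreSQL server is listening on localhost:5432 in the testbed; the credentials mirror the test parameters and are otherwise illustrative:

    import psycopg2

    try:
        conn = psycopg2.connect(
            dbname="pandas", user="postgres", password="postgres",
            host="localhost", port=5432,
        )
    except psycopg2.OperationalError as exc:
        # the branch taken in this log: the server is simply not running
        print(f"could not connect: {exc}")
    else:
        conn.close()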
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______ test_sqlalchemy_integer_mapping[Int64-BIGINT-mysql_pymysql_engine] ______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'Int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
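The callspec branch above fires when a fixture is driven by the test's parametrization; one common way to reach it is indirect parametrization, sketched here with hypothetical names as an aside, not part of this run:

    import pytest

    @pytest.fixture
    def backend(request):
        # the parametrize value arrives via request.param
        return f"backend:{request.param}"

    @pytest.mark.parametrize("backend", ["sqlite", "postgres"], indirect=True)
    def test_backend_param(backend):
        assert backend.startswith("backend:")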
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6afa70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6afa70> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______ test_sqlalchemy_integer_mapping[Int64-BIGINT-mysql_pymysql_conn] _______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'Int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
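The override resolution described in the comments above can be seen in a self-contained sketch (hypothetical names, not part of this run): a class-level fixture overrides a module-level fixture of the same name and receives the overridden value one level up:

    import pytest

    @pytest.fixture
    def db_url():
        return "sqlite://"

    class TestOverride:
        @pytest.fixture
        def db_url(self, db_url):  # requesting its own name yields the module-level value
            return db_url + "/memory"

        def test_override(self, db_url):
            assert db_url == "sqlite:///memory"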
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6afb30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6afb30> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___ test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_engine] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
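Note: the AttributeError above ('cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is the root cause of every pymysql collection error in this run; it typically indicates that the compiled _rust extension and the pure-Python cryptography sources come from mismatched versions. A hedged diagnostic sketch (the private _rust layout varies across cryptography releases, so treat this as troubleshooting only):

import cryptography
from cryptography.hazmat.bindings import _rust

print("cryptography", cryptography.__version__)
print("has openssl submodule:", hasattr(_rust, "openssl"))
if hasattr(_rust, "openssl"):
    # On this testbed 'hashes' is missing, which usually means the compiled
    # extension was built for a different cryptography release.
    print("openssl exposes hashes:", hasattr(_rust.openssl, "hashes"))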
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
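Note: the raw_connection() docstring quoted above describes a proxied DBAPI connection whose close() returns it to the pool. A small self-contained sketch of that usage, against an in-memory SQLite engine so no external server is needed:

from sqlalchemy import create_engine

engine = create_engine("sqlite:///:memory:")
raw = engine.raw_connection()      # proxied DBAPI connection checked out from the pool
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())          # (1,)
    cur.close()
finally:
    raw.close()                    # returns the connection to the pool rather than closing it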
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
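Note: the util.safe_reraise frame above re-raises the pool creator's exception with its original traceback after cleanup. A generic Python sketch of that re-raise pattern (not SQLAlchemy's actual implementation):

import sys

def call_with_cleanup(creator, cleanup):
    try:
        return creator()
    except BaseException:
        exc_type, exc_value, exc_tb = sys.exc_info()
        try:
            cleanup()                              # best-effort cleanup before re-raising
        finally:
            raise exc_value.with_traceback(exc_tb) # preserve the original traceback

def failing_creator():
    raise ConnectionRefusedError("port 5432: Connection refused")

try:
    call_with_cleanup(failing_creator, lambda: print("cleaning up"))
except ConnectionRefusedError as e:
    print("re-raised:", e)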
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'Int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # 
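Note: the parametrize table above pins which SQLAlchemy integer type pandas is expected to emit for each integer dtype. The sketch below reproduces that mapping from numpy dtype metadata as an illustration; it is not pandas' internal code in pandas.io.sql.

import numpy as np

def expected_sql_integer(name: str) -> str:
    dt = np.dtype(name.lower())        # "Int64" -> int64, "UInt8" -> uint8
    unsigned = dt.kind == "u"
    bits = dt.itemsize * 8
    if bits <= 8 or (bits == 16 and not unsigned):
        return "SMALLINT"
    if bits == 16 or (bits == 32 and not unsigned):
        return "INTEGER"
    return "BIGINT"

for name, want in [("int8", "SMALLINT"), ("UInt16", "INTEGER"),
                   ("uint32", "BIGINT"), ("Int64", "BIGINT")]:
    assert expected_sql_integer(name) == want, name
print("mapping matches the parametrized expectations above")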
transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = 
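Note: the pandas.io.sql.SQLDatabase frame above ties engine and connection cleanup to a contextlib.ExitStack. A generic sketch of the same pattern, against SQLite so it runs without a server (TinyDatabase is a hypothetical name, not a pandas class):

from contextlib import ExitStack
from sqlalchemy import create_engine, text

class TinyDatabase:
    def __init__(self, url: str) -> None:
        self.exit_stack = ExitStack()
        engine = create_engine(url)
        self.exit_stack.callback(engine.dispose)            # dispose engine when the stack closes
        self.con = self.exit_stack.enter_context(engine.connect())

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.exit_stack.close()                             # closes connection, then disposes engine

with TinyDatabase("sqlite:///:memory:") as db:
    print(db.con.execute(text("SELECT 1")).scalar())        # 1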
isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
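Note: _handle_dbapi_exception_noconnection above wraps the driver's error into sqlalchemy.exc.OperationalError and chains the original. A hedged sketch of catching the wrapped error and inspecting the underlying DBAPI exception; the URL matches the one in this run and nothing is expected to be listening on it, so connect() should fail:

from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    engine.connect()
except OperationalError as exc:
    print(type(exc.orig).__name__)     # psycopg2.OperationalError in the run above
    print(exc.orig is exc.__cause__)   # the original error is chained, as shown in the traceback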
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____ test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_conn] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
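Note: every failure in this block reduces to "no PostgreSQL server listening on localhost:5432". A hedged sketch of gating such tests on a quick TCP probe; this is an illustration, not the gating logic the pandas test suite actually uses.

import socket
import pytest

def _postgres_reachable(host: str = "localhost", port: int = 5432) -> bool:
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        return False

requires_postgres = pytest.mark.skipif(
    not _postgres_reachable(), reason="no PostgreSQL server on localhost:5432"
)

@requires_postgres
def test_roundtrip_needs_live_server():
    ...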
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'Int64', expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
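Note: the frames above show the test resolving its parametrized connection fixture by name via request.getfixturevalue(). A minimal self-contained example of that pattern; the fixture names here are hypothetical, not the ones from test_sql.py:

import pytest

@pytest.fixture
def sqlite_url():
    return "sqlite:///:memory:"

@pytest.fixture
def postgres_url():
    return "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"

@pytest.mark.parametrize("backend", ["sqlite_url", "postgres_url"])
def test_pick_backend(backend, request):
    url = request.getfixturevalue(backend)   # dynamic fixture lookup, as in the trace above
    assert "://" in url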
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _______ test_sqlalchemy_integer_mapping[int-BIGINT-mysql_pymysql_engine] _______ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = , expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
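[Editor's note] The OperationalError above ("Connection refused" on localhost:5432) means nothing is listening on the PostgreSQL port inside the testbed, not that the credentials or database name are wrong. A minimal sketch, independent of the test suite, for checking that precondition before the SQL tests run (host and port taken from the DSN shown in the traceback):

    import socket

    def postgres_reachable(host="localhost", port=5432, timeout=3.0):
        # A bare TCP connect distinguishes "Connection refused" (nothing
        # listening) from authentication or database-level errors that would
        # only appear after a successful socket connection.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("PostgreSQL reachable:", postgres_reachable())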
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
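[Editor's note] The dtype-to-SQL-type expectations exercised by test_sqlalchemy_integer_mapping (GH35076) are easier to read pulled out of the parametrize decorator. A sketch of the same table, with values copied from the parametrization shown above; the plain `int` entry is platform-dependent, which matters on this i386 run:

    import numpy as np

    EXPECTED_SQL_TYPE = {
        "int8": "SMALLINT", "Int8": "SMALLINT", "uint8": "SMALLINT", "UInt8": "SMALLINT",
        "int16": "SMALLINT", "Int16": "SMALLINT",
        "uint16": "INTEGER", "UInt16": "INTEGER",
        "int32": "INTEGER", "Int32": "INTEGER",
        "uint32": "BIGINT", "UInt32": "BIGINT",
        "int64": "BIGINT", "Int64": "BIGINT",
        # On 32-bit platforms np.dtype(int).name is "int32", so plain int maps
        # to INTEGER rather than BIGINT.
        int: "BIGINT" if np.dtype(int).name == "int64" else "INTEGER",
    }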
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
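[Editor's note] The long pytest internals above are all triggered by one pattern in the pandas test: the connection fixture name arrives as a parametrized string and is resolved at run time with request.getfixturevalue(). A minimal, self-contained sketch of that pattern (the fixture name here is hypothetical, standing in for mysql_pymysql_engine):

    import pytest

    @pytest.fixture
    def fake_engine():
        # Hypothetical stand-in for the real database-engine fixture.
        return "engine"

    @pytest.mark.parametrize("conn", ["fake_engine"])
    def test_dynamic_fixture_lookup(conn, request):
        # The parametrized value is only a fixture *name*; the fixture itself
        # is set up lazily here, which is the code path erroring above.
        conn = request.getfixturevalue(conn)
        assert conn == "engine"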
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
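[Editor's note] versioned_importorskip is documented above as a Debian-specific replacement for pytest.importorskip that also enforces pandas' minimum version for the optional dependency. A sketch of the stock pytest equivalent (the minversion value is illustrative, not pandas' actual pin):

    import pytest

    # Skip the test, rather than error, when the driver is missing or too old.
    pymysql = pytest.importorskip("pymysql", minversion="1.0.0")

Note that importorskip only converts ImportError into a skip; the AttributeError raised while importing pymysql in this run would still surface as a test error, which is what the traceback shows.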
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6affb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea6affb0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ________ test_sqlalchemy_integer_mapping[int-BIGINT-mysql_pymysql_conn] ________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = , expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 
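[Editor's note] The root failure for the pymysql-backed variants is the AttributeError at the end of cryptography's hashes.py: the compiled _rust bindings expose no `hashes` attribute, which suggests the pure-Python sources and the compiled extension come from mismatched cryptography builds on this testbed. A one-line reproduction sketch, outside the test run, using the same access the traceback shows:

    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # On a healthy installation this prints the Hash class; on this testbed it
    # raises the same AttributeError seen in the log.
    print(rust_openssl.hashes.Hash)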
2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea724110>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea724110> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ____ test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_engine] ____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
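The AttributeError reported above, module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes', is the actual root of the pymysql failure: cryptography's pure-Python hashes.py expects the compiled Rust extension to expose a hashes submodule, and on this testbed it does not, which usually indicates a python3-cryptography whose Python layer and Rust binding come from different builds. A small diagnostic sketch, runnable on the same image and independent of pandas:

import cryptography
from cryptography.hazmat.bindings import _rust

print("cryptography version:", cryptography.__version__)
# On a consistent install this prints True; the AttributeError in the log
# implies it would print False on this testbed.
print("rust bindings expose 'hashes':", hasattr(_rust.openssl, "hashes"))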
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
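The dsn and cparams shown above are what the psycopg2 dialect ultimately hands to the driver. Stripped of SQLAlchemy, the equivalent direct call (parameters copied from the log; the password is already visible in the dsn there) fails with the same "Connection refused" when nothing listens on port 5432:

import psycopg2

# Direct DBAPI call equivalent to what dialect.connect() issues above;
# on this testbed it raises OperationalError: Connection refused.
conn = psycopg2.connect(
    host="localhost",
    port=5432,
    dbname="pandas",
    user="postgres",
    password="postgres",
)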
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = , expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2496: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if 
any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 
2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
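Every OperationalError in this block has the same root cause: no PostgreSQL server appears to be listening on localhost:5432 inside the testbed, so both the IPv6 and IPv4 connection attempts are refused. A stdlib-only probe, independent of psycopg2, that confirms this:

import socket

# Probe TCP port 5432 on both loopback addresses, mirroring the two
# "Connection refused" lines psycopg2 reports in this log.
for host in ("127.0.0.1", "::1"):
    family = socket.AF_INET6 if ":" in host else socket.AF_INET
    with socket.socket(family, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        try:
            s.connect((host, 5432))
            print(f"{host}:5432 reachable")
        except OSError as exc:
            print(f"{host}:5432 not reachable: {exc}")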
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____ test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_conn] _____ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = , expected = 'BIGINT' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize( 2412s "integer, expected", 2412s [ 2412s ("int8", "SMALLINT"), 2412s ("Int8", "SMALLINT"), 2412s ("uint8", "SMALLINT"), 2412s ("UInt8", "SMALLINT"), 2412s ("int16", "SMALLINT"), 2412s ("Int16", "SMALLINT"), 2412s ("uint16", "INTEGER"), 2412s ("UInt16", "INTEGER"), 2412s ("int32", "INTEGER"), 2412s ("Int32", "INTEGER"), 2412s ("uint32", "BIGINT"), 2412s ("UInt32", "BIGINT"), 2412s ("int64", "BIGINT"), 2412s ("Int64", "BIGINT"), 2412s (int, "BIGINT" if np.dtype(int).name == "int64" else "INTEGER"), 2412s ], 2412s ) 2412s def test_sqlalchemy_integer_mapping(conn, request, integer, expected): 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2494: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
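Back to the test itself: the dtype-to-SQL-type mapping that test_sqlalchemy_integer_mapping asserts (GH35076) does not actually need a running server to be observed. A hedged sketch, not part of the test suite, using an in-memory SQLite engine:

import pandas as pd
from pandas.io.sql import get_schema
from sqlalchemy import create_engine

engine = create_engine("sqlite:///:memory:")  # no server required
df = pd.DataFrame([0, 1], columns=["a"], dtype="int64")
# Prints the CREATE TABLE statement pandas would emit; on a 64-bit platform
# the int64 column comes out as BIGINT, matching the expectation in the log.
print(get_schema(df, "test_type", con=engine))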
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
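_handle_dbapi_exception_noconnection, quoted in full above, is what turns the raw psycopg2.OperationalError into the sqlalchemy.exc.OperationalError seen elsewhere in this log (with the sqlalche.me/e/20/e3q8 link appended). The practical consequence for callers is that the wrapped exception is the one to catch; a sketch reusing the engine URL from the log:

from sqlalchemy import create_engine, exc

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect():
        pass
except exc.OperationalError as err:
    # err.orig is the original psycopg2.OperationalError instance
    print("wrapped:", type(err.orig).__name__, err.orig)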
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____ test_sqlalchemy_integer_overload_mapping[uint64-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'uint64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2506: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
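[editor's note] The fixture chain above runs through pytest's dynamic fixture lookup: the test asks for a fixture by name via request.getfixturevalue(), which resolves the fixture definition and executes it at test run time. A minimal, self-contained sketch of that mechanism with hypothetical fixture names (not the pandas connectable fixtures):

    # Hedged sketch of request.getfixturevalue(), the dynamic lookup used by
    # test_sqlalchemy_integer_overload_mapping above. Names are illustrative.
    import pytest

    @pytest.fixture
    def backend_name():
        return "sqlite"

    def test_dynamic_fixture_lookup(request):
        # Resolves the fixture at run time instead of via a function parameter.
        value = request.getfixturevalue("backend_name")
        assert value == "sqlite"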
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7245f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7245f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____ test_sqlalchemy_integer_overload_mapping[uint64-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'uint64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2506: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
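[editor's note] The MySQL-backed failures above are not connectivity problems: importing pymysql pulls in cryptography for its auth helpers, and the installed cryptography package's Python layer does not match its compiled _rust extension, so the import dies with AttributeError before any connection is attempted. A small check for the same mismatch (a sketch, assuming the broken cryptography build seen in this run):

    # Hedged sketch: the AttributeError above fires at import time, so simply
    # importing pymysql (or cryptography's hashes module) reproduces it.
    import importlib

    for mod in ("cryptography.hazmat.primitives.hashes", "pymysql"):
        try:
            importlib.import_module(mod)
            print(f"{mod}: import ok")
        except AttributeError as exc:
            print(f"{mod}: broken cryptography build -> {exc}")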
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7246b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7246b0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _ test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
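The AttributeError just above is the root of the pymysql-backed failures in this run: hashes.py expects the compiled cryptography.hazmat.bindings._rust.openssl extension to expose a hashes submodule, and the python3-cryptography build on this testbed does not, so the pymysql -> _auth -> cryptography import chain collapses before any MySQL test can start. A minimal check of that mismatch, as a sketch run on the same testbed (not part of the test suite):

    # sketch: confirm the Python-layer / Rust-binding mismatch seen above
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # On a consistent install this prints True; on this testbed it prints
    # False, and the high-level import below then raises the same
    # AttributeError reported at hashes.py:87.
    print(hasattr(rust_openssl, "hashes"))
    from cryptography.hazmat.primitives import hashes  # noqa: E402,F401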
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'uint64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s conn = request.getfixturevalue(conn) 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2509: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 
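Every PostgreSQL failure in this block is the same condition: nothing listens on localhost:5432 inside the testbed, so the DSN the fixtures build ('host=localhost dbname=pandas user=postgres password=postgres port=5432', shown above) is refused at the psycopg2 level. A standalone pre-check using that DSN, as a sketch outside the test suite:

    # sketch: probe the PostgreSQL service the pandas SQL fixtures expect
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", dbname="pandas",
            user="postgres", password="postgres", port=5432,
        )
        conn.close()
        print("PostgreSQL reachable")
    except psycopg2.OperationalError as exc:
        # On this testbed the connection is refused, matching the log above.
        print("PostgreSQL not reachable:", exc)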
2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if 
should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s __ test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
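Here the same refusal reappears wrapped as sqlalchemy.exc.OperationalError, because the postgresql_psycopg2_engine fixture hands pandas an Engine for postgresql+psycopg2://postgres:***@localhost:5432/pandas. The wrapped form can be reproduced outside pytest with a short sketch (the URL below is reconstructed from the log, where the password is masked):

    # sketch: reproduce the wrapped OperationalError at the SQLAlchemy level
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(text("select 1"))
    except Exception as exc:
        # sqlalchemy.exc.OperationalError wrapping psycopg2.OperationalError,
        # as in the traceback above (see https://sqlalche.me/e/20/e3q8).
        print(type(exc).__name__, exc)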
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'uint64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2506: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
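The pytest frames above show how test_sql.py reaches its connection fixtures: the test is parametrized over fixture names, and request.getfixturevalue() resolves each name to the fixture's value at test time, which is what pulls in postgresql_psycopg2_conn and, through it, the failing engine. A self-contained illustration of that pattern, with hypothetical fixture names not taken from pandas:

    # sketch: parametrize over a fixture name and resolve it at test time
    import pytest

    @pytest.fixture
    def dummy_conn_str():
        return "sqlite://"

    @pytest.mark.parametrize("conn", ["dummy_conn_str"])
    def test_resolves_fixture_by_name(conn, request):
        # "conn" arrives as a fixture name string; getfixturevalue runs the
        # fixture and returns its value, the mechanism traced above.
        conn = request.getfixturevalue(conn)
        assert conn == "sqlite://"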
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
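Note that each failure surfaces inside _ConnectionRecord.__connect() rather than when the Engine fixture is built: the connection pool only invokes the DBAPI creator on first checkout, so create_engine() succeeds even with no server running and the error only appears once a test asks for a connection. A short illustration of that laziness, as a sketch using the same URL as above:

    # sketch: create_engine() opens no socket; the pool connects lazily
    from sqlalchemy import create_engine

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    print(engine)  # fine even with nothing listening on port 5432
    # engine.connect() would raise OperationalError here, as seen in the log.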
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____ test_sqlalchemy_integer_overload_mapping[UInt64-mysql_pymysql_engine] _____ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s integer = 'UInt64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2506: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
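Most of the traceback above is pytest's dynamic fixture resolution: the pandas test is parametrized over fixture names ("mysql_pymysql_engine" and friends) and resolves them at runtime with request.getfixturevalue(), so any setup error surfaces during fixture setup rather than in the test body. A small self-contained sketch of that pattern; the fixture names below (fast_backend, slow_backend) are hypothetical, not the pandas ones:

import pytest

@pytest.fixture
def fast_backend():
    return "sqlite:///:memory:"

@pytest.fixture
def slow_backend():
    # In pandas' test_sql.py this is where engine creation (and the failure) happens.
    return "postgresql+psycopg2://localhost/demo"

@pytest.mark.parametrize("backend_name", ["fast_backend", "slow_backend"])
def test_backend_url(request, backend_name):
    # Resolve the fixture by name at runtime, as test_sql.py does with 'conn'.
    backend = request.getfixturevalue(backend_name)
    assert backend.startswith(("sqlite", "postgresql"))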
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
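The mysql_pymysql_engine fixture calls versioned_importorskip(), the Debian-specific wrapper quoted above, which defers to pandas' import_optional_dependency() (source also quoted above). The except clause of the wrapper is not shown here, but only ImportError (module missing or too old) appears to be converted into a skip, which would explain why the AttributeError raised further down this import chain surfaces as a test error rather than a skip. A hedged sketch of how the helper behaves; note it is a private pandas API (pandas.compat._optional), so the import path is an implementation detail:

from pandas.compat._optional import import_optional_dependency

# errors="raise" (the default): ImportError if the module is missing or older
# than pandas' minimum supported version for that optional dependency.
try:
    pymysql = import_optional_dependency("pymysql")
except ImportError as exc:
    pymysql = None
    print(f"MySQL-backed tests would be skipped: {exc}")

# errors="ignore": return the module even if it is too old, or None if absent,
# leaving any version check to the caller.
sa = import_optional_dependency("sqlalchemy", errors="ignore")
print("sqlalchemy importable:", sa is not None)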
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea724b30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea724b30> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____ test_sqlalchemy_integer_overload_mapping[UInt64-mysql_pymysql_conn] ______ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s integer = 'UInt64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2506: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
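The import chain ends inside cryptography itself: importing cryptography.hazmat.primitives.hashes executes Hash = rust_openssl.hashes.Hash, and the installed Rust bindings expose no 'hashes' attribute, which suggests the pure-Python layer and the compiled _rust extension are out of sync on this testbed. A short diagnostic sketch, independent of pytest, that reports whether the installation is usable at all:

# On a healthy installation this prints a SHA-256 digest; on this testbed the
# import itself raises the AttributeError shown above.
try:
    from cryptography.hazmat.primitives import hashes
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas")
    print("cryptography OK:", digest.finalize().hex())
except Exception as exc:  # AttributeError here; ImportError if cryptography is absent
    print(f"cryptography unusable: {exc!r}")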
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea724bf0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea724bf0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _ test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_engine] __ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s integer = 'UInt64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s conn = request.getfixturevalue(conn) 2412s # GH35076 Map pandas integer to optimal SQLAlchemy integer type 2412s df = DataFrame([0, 1], columns=["a"], dtype=integer) 2412s > with sql.SQLDatabase(conn) as db: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2509: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 
2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if 
should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s __ test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_conn] ___ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s integer = 'UInt64' 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s @pytest.mark.parametrize("integer", ["uint64", "UInt64"]) 2412s def test_sqlalchemy_integer_overload_mapping(conn, request, integer): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2506: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________________ test_database_uri_string[mysql_pymysql_engine] ________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_database_uri_string(conn, request, test_frame1): 2412s td.versioned_importorskip("sqlalchemy") 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2519: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
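[editor's note] For reference, import_optional_dependency, whose docstring is quoted above, is what versioned_importorskip delegates to. A rough usage sketch of the errors modes it documents follows; the import path is a pandas-internal detail taken from the traceback and may change between releases.

# Illustrative only: pandas-internal helper quoted in the traceback above.
from pandas.compat._optional import import_optional_dependency

# errors="raise" (the default used by versioned_importorskip) raises
# ImportError when the module is absent; errors="ignore" returns None instead.
# Note: only ImportError is swallowed; an AttributeError raised while the
# module initialises (as with pymysql -> cryptography here) still propagates.
mod = import_optional_dependency("pymysql", errors="ignore")
print("pymysql importable:", mod is not None)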
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea725490>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea725490> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________________ test_database_uri_string[mysql_pymysql_conn] _________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_database_uri_string(conn, request, test_frame1): 2412s td.versioned_importorskip("sqlalchemy") 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2519: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 
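[editor's note] Both MySQL parametrizations fail before any connection is attempted: importing pymysql pulls in cryptography, whose pure-Python hashes.py expects rust_openssl.hashes, and the installed compiled _rust bindings do not provide it. That points at mismatched cryptography Python and Rust components on the testbed rather than a pandas bug. A small diagnostic sketch follows, assuming it is run in the same interpreter; it relies only on the public __version__ attribute and a guarded import.

# Hedged diagnostic: check whether cryptography's Python layer and its
# compiled _rust bindings agree, without importing pymysql at all.
import importlib

import cryptography

print("cryptography version:", cryptography.__version__)
try:
    hashes = importlib.import_module("cryptography.hazmat.primitives.hashes")
    print("hashes import OK:", hashes.SHA256().name)
except AttributeError as exc:
    # Same symptom as the traceback above: the installed _rust extension was
    # built for a different cryptography release than the Python sources.
    print("python/rust binding mismatch:", exc)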
2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7255b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7255b0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______________ test_database_uri_string[postgresql_psycopg2_conn] ______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_database_uri_string(conn, request, test_frame1): 2412s td.versioned_importorskip("sqlalchemy") 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2519: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ________ test_pg8000_sqlalchemy_passthrough_error[mysql_pymysql_engine] ________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @td.skip_if_installed("pg8000") 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_pg8000_sqlalchemy_passthrough_error(conn, request): 2412s td.versioned_importorskip("sqlalchemy") 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2541: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea725c70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea725c70> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________ test_pg8000_sqlalchemy_passthrough_error[mysql_pymysql_conn] _________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @td.skip_if_installed("pg8000") 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_pg8000_sqlalchemy_passthrough_error(conn, request): 2412s td.versioned_importorskip("sqlalchemy") 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2541: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
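The AttributeError above is the root cause of every pymysql-backed failure in this run: the pure-Python side of python3-cryptography expects its compiled Rust bindings to expose a 'hashes' submodule, the bindings on this testbed do not, and the exception escapes while pymysql is being imported, long before pandas' version check runs. A minimal diagnostic sketch, assuming it is executed on the same testbed (the printed messages are illustrative, not from the log):

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography", cryptography.__version__)
if hasattr(rust_openssl, "hashes"):
    print("Rust bindings expose 'hashes'; the pymysql import chain should work")
else:
    print("Rust bindings lack 'hashes'; importing pymysql fails as in this log")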
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
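# Toy illustration with hypothetical names, not part of _pytest/fixtures.py:
# the comments above say `fixturedefs` is ordered furthest -> closest, and each
# requester in the chain that shares the fixture's name pushes the negative
# index one level further out, selecting the definition being overridden.
fixturedefs = ["conftest.py::db", "test_mod.py::db", "TestClass::db"]  # furthest -> closest
index = -1                    # closest definition by default
for fixturename in ["db"]:    # the class-level fixture requests its own name once
    if fixturename == "db":
        index -= 1
print(fixturedefs[index])     # -> "test_mod.py::db", one level up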
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
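A hedged usage sketch of this helper, assuming pandas is importable on the testbed. The errors= modes described below only govern missing packages and too-old versions; the AttributeError raised inside cryptography in this log is not intercepted by any of them:

from pandas.compat._optional import import_optional_dependency

# errors="ignore" returns None when the package is absent, or the module even if
# its version is too old; it does not catch failures raised while the package
# itself imports, so on this testbed the call would still end in AttributeError.
pymysql = import_optional_dependency("pymysql", errors="ignore")
print(pymysql)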
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea725d30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea725d30> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______ test_pg8000_sqlalchemy_passthrough_error[postgresql_psycopg2_conn] ______ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @td.skip_if_installed("pg8000") 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_pg8000_sqlalchemy_passthrough_error(conn, request): 2412s td.versioned_importorskip("sqlalchemy") 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2541: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
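The pandas test quoted earlier parametrizes over fixture names and resolves them lazily with request.getfixturevalue, so an unusable driver shows up as a fixture setup error rather than a collection error. A minimal sketch of that pattern, using a hypothetical sqlite-backed fixture rather than pandas' real ones:

import sqlite3

import pytest


@pytest.fixture
def sqlite_conn():
    conn = sqlite3.connect(":memory:")
    yield conn
    conn.close()


@pytest.mark.parametrize("conn", ["sqlite_conn"])
def test_roundtrip(conn, request):
    conn = request.getfixturevalue(conn)  # resolved at setup time, as in test_sql.py
    conn.execute("CREATE TABLE t (x INTEGER)")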
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_query_by_text_obj[mysql_pymysql_engine_iris] _______________ 2412s conn = 'mysql_pymysql_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_text_obj(conn, request): 2412s # WIP : GH10846 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2553: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726450>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726450> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______________ test_query_by_text_obj[mysql_pymysql_conn_iris] ________________ 2412s conn = 'mysql_pymysql_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_text_obj(conn, request): 2412s # WIP : GH10846 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2553: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 
2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726570>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726570> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________ test_query_by_text_obj[postgresql_psycopg2_engine_iris] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
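Editor's note on the mysql_pymysql_engine fixture failure above: pymysql reaches cryptography through pymysql._auth, and cryptography's Python layer does ``Hash = rust_openssl.hashes.Hash`` unconditionally, so the AttributeError means the compiled Rust bindings shipped in the i386 python3-cryptography on this testbed do not expose a ``hashes`` submodule. A minimal sketch (not taken from the log) that probes for the same mismatch, assuming the testbed's cryptography packages:

    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # cryptography/hazmat/primitives/hashes.py accesses rust_openssl.hashes.Hash
    # at import time, so a missing ``hashes`` attribute breaks any import chain
    # that reaches it (pymysql -> pymysql._auth -> cryptography hashes).
    if hasattr(rust_openssl, "hashes"):
        print("rust bindings expose hashes:", rust_openssl.hashes.Hash)
    else:
        print("mismatch: the Python layer expects rust_openssl.hashes, "
              "but the compiled bindings do not provide it")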
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_text_obj(conn, request): 2412s # WIP : GH10846 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2553: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
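For context on why the unreachable database surfaces here rather than in pandas code: test_query_by_text_obj is parametrized over fixture *names* (sqlalchemy_connectable_iris) and resolves each name at run time with request.getfixturevalue, so every connection flavour errors during fixture setup. A minimal sketch of that pattern using a hypothetical fixture name, not the real test_sql fixtures:

    import pytest

    @pytest.fixture
    def sqlite_str():
        # hypothetical stand-in for the connection fixtures in test_sql.py
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_roundtrip(conn, request):
        # resolve the fixture named by the parametrize value,
        # as test_query_by_text_obj does above
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")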
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'reque... >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'reque... >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_query_by_text_obj[postgresql_psycopg2_conn_iris] _____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_text_obj(conn, request): 2412s # WIP : GH10846 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2553: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____________ test_query_by_select_obj[mysql_pymysql_engine_iris] ______________ 2412s conn = 'mysql_pymysql_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_select_obj(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2567: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
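A note on the failure mode above: every PostgreSQL-backed test_sql fixture fails for the same reason, namely that nothing is listening on localhost:5432 inside the testbed, so building the postgresql+psycopg2://postgres:***@localhost:5432/pandas engine always ends in OperationalError ("Connection refused"). Below is a minimal sketch that reproduces the check outside the pandas suite, assuming only that SQLAlchemy and psycopg2 are installed (as they are on this testbed) and reusing the DSN parameters visible in the traceback:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    # Same connection parameters the pandas fixtures use (the traceback shows password=postgres).
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
        print("PostgreSQL is reachable")
    except OperationalError as exc:
        # Without a server on port 5432 this prints the same
        # 'connection to server at "localhost" ... Connection refused' message as the log.
        print(f"PostgreSQL not reachable: {exc}")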
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726c30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726c30> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______________ test_query_by_select_obj[mysql_pymysql_conn_iris] _______________ 2412s conn = 'mysql_pymysql_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_select_obj(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2567: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 
2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726d50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea726d50> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __________ test_query_by_select_obj[postgresql_psycopg2_engine_iris] ___________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_select_obj(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2567: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'reque...SubRequest 'postgresql_psycopg2_engine_iris' for >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'reque...SubRequest 'postgresql_psycopg2_engine_iris' for >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___________ test_query_by_select_obj[postgresql_psycopg2_conn_iris] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_query_by_select_obj(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2567: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'reque... >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'reque... >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ______________ test_column_with_percentage[mysql_pymysql_engine] _______________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_column_with_percentage(conn, request): 2412s # GH 37157 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2588: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
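The fixture frames above come from pandas' pattern of parametrizing test_column_with_percentage over connection fixture names and resolving them lazily with request.getfixturevalue, so only the variants whose database backend is unavailable fail. A hedged sketch of that pattern with a stand-in SQLite fixture (the fixture and test names here are hypothetical, not taken from test_sql.py):

import pytest
import sqlalchemy

@pytest.fixture
def sqlite_engine():
    # stand-in for the suite's engine fixtures; in-memory, needs no server
    return sqlalchemy.create_engine("sqlite://")

@pytest.mark.parametrize("conn", ["sqlite_engine"])
def test_roundtrip(conn, request):
    engine = request.getfixturevalue(conn)   # resolves the fixture named by the parameter
    assert engine is not None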
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
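The frames above show why the mysql variants error out instead of skipping: the fixture calls the Debian-specific versioned_importorskip, which wraps pandas' internal import_optional_dependency, and that helper (per the docstring quoted above) raises ImportError for a missing or too-old package. Here pymysql is installed but its import chain raises an AttributeError instead, which presumably falls outside what versioned_importorskip converts into a skip. A hedged sketch of the helper's documented behaviour (private pandas API, module path taken from the traceback):

from pandas.compat._optional import import_optional_dependency

try:
    # errors="raise" (the default) raises ImportError with an install hint
    # when the package is missing or older than pandas' minimum version.
    pymysql = import_optional_dependency("pymysql", errors="raise")
except ImportError as exc:
    # roughly: Missing optional dependency 'pymysql'. Use pip or conda to install pymysql.
    print(exc)

# errors="ignore" returns None for a missing package instead of raising.
maybe_mod = import_optional_dependency("pymysql", errors="ignore")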
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7273b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7273b0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _______________ test_column_with_percentage[mysql_pymysql_conn] ________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_column_with_percentage(conn, request): 2412s # GH 37157 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2588: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
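The mysql failures never reach a database at all: importing pymysql pulls in pymysql._auth, which imports cryptography.hazmat.primitives.hashes, and that module fails at import time because the installed cryptography Rust bindings expose no openssl.hashes attribute. That points at a broken or mismatched python3-cryptography build on this testbed rather than a pandas or pymysql problem. A minimal reproduction outside pytest, using only module paths shown in the traceback:

try:
    # fails at module import, at the line "Hash = rust_openssl.hashes.Hash"
    from cryptography.hazmat.primitives import hashes
    print("cryptography hashes OK:", hashes.SHA256().name)
except AttributeError as exc:
    # AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'
    print("broken cryptography install:", exc)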
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7274d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7274d0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ___________ test_column_with_percentage[postgresql_psycopg2_engine] ____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_column_with_percentage(conn, request): 2412s # GH 37157 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s conn = request.getfixturevalue(conn) 2412s df = DataFrame({"A": [0, 1, 2], "%_variation": [3, 4, 5]}) 2412s > df.to_sql(name="test_column_percentage", con=conn, index=False) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2590: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ( A %_variation 2412s 0 0 3 2412s 1 1 4 2412s 2 2 5,) 2412s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_column_percentage'} 2412s 2412s @wraps(func) 2412s def wrapper(*args, **kwargs): 2412s if len(args) > num_allow_args: 2412s warnings.warn( 2412s msg.format(arguments=_format_argument_list(allow_args)), 2412s FutureWarning, 2412s stacklevel=find_stack_level(), 2412s ) 2412s > return func(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = A %_variation 2412s 0 0 3 2412s 1 1 4 2412s 2 2 5 2412s name = 'test_column_percentage' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, if_exists = 'fail', index = False, index_label = None 2412s chunksize = None, dtype = None, method = None 2412s 2412s @final 2412s 
@deprecate_nonkeyword_arguments( 2412s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2412s ) 2412s def to_sql( 2412s self, 2412s name: str, 2412s con, 2412s schema: str | None = None, 2412s if_exists: Literal["fail", "replace", "append"] = "fail", 2412s index: bool_t = True, 2412s index_label: IndexLabel | None = None, 2412s chunksize: int | None = None, 2412s dtype: DtypeArg | None = None, 2412s method: Literal["multi"] | Callable | None = None, 2412s ) -> int | None: 2412s """ 2412s Write records stored in a DataFrame to a SQL database. 2412s 2412s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2412s newly created, appended to, or overwritten. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s Name of SQL table. 2412s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. Legacy support is provided for sqlite3.Connection objects. The user 2412s is responsible for engine disposal and connection closure for the SQLAlchemy 2412s connectable. See `here \ 2412s `_. 2412s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2412s the transaction will not be committed. If passing a sqlite3.Connection, 2412s it will not be possible to roll back the record insertion. 2412s 2412s schema : str, optional 2412s Specify the schema (if database flavor supports this). If None, use 2412s default schema. 2412s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2412s How to behave if the table already exists. 2412s 2412s * fail: Raise a ValueError. 2412s * replace: Drop the table before inserting new values. 2412s * append: Insert new values to the existing table. 2412s 2412s index : bool, default True 2412s Write DataFrame index as a column. Uses `index_label` as the column 2412s name in the table. Creates a table index for this column. 2412s index_label : str or sequence, default None 2412s Column label for index column(s). If None is given (default) and 2412s `index` is True, then the index names are used. 2412s A sequence should be given if the DataFrame uses MultiIndex. 2412s chunksize : int, optional 2412s Specify the number of rows in each batch to be written at a time. 2412s By default, all rows will be written at once. 2412s dtype : dict or scalar, optional 2412s Specifying the datatype for columns. If a dictionary is used, the 2412s keys should be the column names and the values should be the 2412s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2412s scalar is provided, it will be applied to all columns. 2412s method : {None, 'multi', callable}, optional 2412s Controls the SQL insertion clause used: 2412s 2412s * None : Uses standard SQL ``INSERT`` clause (one per row). 2412s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2412s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2412s 2412s Details and a sample callable implementation can be found in the 2412s section :ref:`insert method `. 2412s 2412s Returns 2412s ------- 2412s None or int 2412s Number of rows affected by to_sql. None is returned if the callable 2412s passed into ``method`` does not return an integer number of rows. 2412s 2412s The number of returned rows affected is the sum of the ``rowcount`` 2412s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2412s reflect the exact number of written rows as stipulated in the 2412s `sqlite3 `__ or 2412s `SQLAlchemy `__. 2412s 2412s .. 
versionadded:: 1.4.0 2412s 2412s Raises 2412s ------ 2412s ValueError 2412s When the table already exists and `if_exists` is 'fail' (the 2412s default). 2412s 2412s See Also 2412s -------- 2412s read_sql : Read a DataFrame from a table. 2412s 2412s Notes 2412s ----- 2412s Timezone aware datetime columns will be written as 2412s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2412s database. Otherwise, the datetimes will be stored as timezone unaware 2412s timestamps local to the original timezone. 2412s 2412s Not all datastores support ``method="multi"``. Oracle, for example, 2412s does not support multi-value insert. 2412s 2412s References 2412s ---------- 2412s .. [1] https://docs.sqlalchemy.org 2412s .. [2] https://www.python.org/dev/peps/pep-0249/ 2412s 2412s Examples 2412s -------- 2412s Create an in-memory SQLite database. 2412s 2412s >>> from sqlalchemy import create_engine 2412s >>> engine = create_engine('sqlite://', echo=False) 2412s 2412s Create a table from scratch with 3 rows. 2412s 2412s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2412s >>> df 2412s name 2412s 0 User 1 2412s 1 User 2 2412s 2 User 3 2412s 2412s >>> df.to_sql(name='users', con=engine) 2412s 3 2412s >>> from sqlalchemy import text 2412s >>> with engine.connect() as conn: 2412s ... conn.execute(text("SELECT * FROM users")).fetchall() 2412s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2412s 2412s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2412s 2412s >>> with engine.begin() as connection: 2412s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2412s ... df1.to_sql(name='users', con=connection, if_exists='append') 2412s 2 2412s 2412s This is allowed to support operations that require that the same 2412s DBAPI connection is used for the entire operation. 2412s 2412s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2412s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2412s 2 2412s >>> with engine.connect() as conn: 2412s ... conn.execute(text("SELECT * FROM users")).fetchall() 2412s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2412s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2412s (1, 'User 7')] 2412s 2412s Overwrite the table with just ``df2``. 2412s 2412s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2412s ... index_label='id') 2412s 2 2412s >>> with engine.connect() as conn: 2412s ... conn.execute(text("SELECT * FROM users")).fetchall() 2412s [(0, 'User 6'), (1, 'User 7')] 2412s 2412s Use ``method`` to define a callable insertion method to do nothing 2412s if there's a primary key conflict on a table in a PostgreSQL database. 2412s 2412s >>> from sqlalchemy.dialects.postgresql import insert 2412s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2412s ... # "a" is the primary key in "conflict_table" 2412s ... data = [dict(zip(keys, row)) for row in data_iter] 2412s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2412s ... result = conn.execute(stmt) 2412s ... return result.rowcount 2412s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2412s 0 2412s 2412s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2412s on a primary key. 2412s 2412s >>> from sqlalchemy.dialects.mysql import insert 2412s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2412s ... # update columns "b" and "c" on primary key conflict 2412s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2412s ... stmt = ( 2412s ... insert(table.table) 2412s ... .values(data) 2412s ... ) 2412s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2412s ... result = conn.execute(stmt) 2412s ... return result.rowcount 2412s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2412s 2 2412s 2412s Specify the dtype (especially useful for integers with missing values). 2412s Notice that while pandas is forced to store the data as floating point, 2412s the database supports nullable integers. When fetching the data with 2412s Python, we get back integer scalars. 2412s 2412s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2412s >>> df 2412s A 2412s 0 1.0 2412s 1 NaN 2412s 2 2.0 2412s 2412s >>> from sqlalchemy.types import Integer 2412s >>> df.to_sql(name='integers', con=engine, index=False, 2412s ... dtype={"A": Integer()}) 2412s 3 2412s 2412s >>> with engine.connect() as conn: 2412s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2412s [(1,), (None,), (2,)] 2412s """ # noqa: E501 2412s from pandas.io import sql 2412s 2412s > return sql.to_sql( 2412s self, 2412s name, 2412s con, 2412s schema=schema, 2412s if_exists=if_exists, 2412s index=index, 2412s index_label=index_label, 2412s chunksize=chunksize, 2412s dtype=dtype, 2412s method=method, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s frame = A %_variation 2412s 0 0 3 2412s 1 1 4 2412s 2 2 5 2412s name = 'test_column_percentage' 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, if_exists = 'fail', index = False, index_label = None 2412s chunksize = None, dtype = None, method = None, engine = 'auto' 2412s engine_kwargs = {} 2412s 2412s def to_sql( 2412s frame, 2412s name: str, 2412s con, 2412s schema: str | None = None, 2412s if_exists: Literal["fail", "replace", "append"] = "fail", 2412s index: bool = True, 2412s index_label: IndexLabel | None = None, 2412s chunksize: int | None = None, 2412s dtype: DtypeArg | None = None, 2412s method: Literal["multi"] | Callable | None = None, 2412s engine: str = "auto", 2412s **engine_kwargs, 2412s ) -> int | None: 2412s """ 2412s Write records stored in a DataFrame to a SQL database. 2412s 2412s Parameters 2412s ---------- 2412s frame : DataFrame, Series 2412s name : str 2412s Name of SQL table. 2412s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2412s or sqlite3 DBAPI2 connection 2412s ADBC provides high performance I/O with native type support, where available. 2412s Using SQLAlchemy makes it possible to use any DB supported by that 2412s library. 2412s If a DBAPI2 object, only sqlite3 is supported. 2412s schema : str, optional 2412s Name of SQL schema in database to write to (if database flavor 2412s supports this). If None, use default schema (default). 2412s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2412s - fail: If table exists, do nothing. 2412s - replace: If table exists, drop it, recreate it, and insert data. 2412s - append: If table exists, insert data. Create if does not exist. 2412s index : bool, default True 2412s Write DataFrame index as a column. 2412s index_label : str or sequence, optional 2412s Column label for index column(s). If None is given (default) and 2412s `index` is True, then the index names are used. 
2412s A sequence should be given if the DataFrame uses MultiIndex. 2412s chunksize : int, optional 2412s Specify the number of rows in each batch to be written at a time. 2412s By default, all rows will be written at once. 2412s dtype : dict or scalar, optional 2412s Specifying the datatype for columns. If a dictionary is used, the 2412s keys should be the column names and the values should be the 2412s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2412s scalar is provided, it will be applied to all columns. 2412s method : {None, 'multi', callable}, optional 2412s Controls the SQL insertion clause used: 2412s 2412s - None : Uses standard SQL ``INSERT`` clause (one per row). 2412s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2412s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2412s 2412s Details and a sample callable implementation can be found in the 2412s section :ref:`insert method `. 2412s engine : {'auto', 'sqlalchemy'}, default 'auto' 2412s SQL engine library to use. If 'auto', then the option 2412s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2412s behavior is 'sqlalchemy' 2412s 2412s .. versionadded:: 1.3.0 2412s 2412s **engine_kwargs 2412s Any additional kwargs are passed to the engine. 2412s 2412s Returns 2412s ------- 2412s None or int 2412s Number of rows affected by to_sql. None is returned if the callable 2412s passed into ``method`` does not return an integer number of rows. 2412s 2412s .. versionadded:: 1.4.0 2412s 2412s Notes 2412s ----- 2412s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2412s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2412s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2412s rows as stipulated in the 2412s `sqlite3 `__ or 2412s `SQLAlchemy `__ 2412s """ # noqa: E501 2412s if if_exists not in ("fail", "replace", "append"): 2412s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2412s 2412s if isinstance(frame, Series): 2412s frame = frame.to_frame() 2412s elif not isinstance(frame, DataFrame): 2412s raise NotImplementedError( 2412s "'frame' argument should be either a Series or a DataFrame" 2412s ) 2412s 2412s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = True 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 
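A minimal sketch of the two to_sql code paths referenced in this traceback: a plain sqlite3 DBAPI connection takes the built-in fallback, while a SQLAlchemy engine or URI string goes through the SQLAlchemy path and therefore needs a reachable server (which is exactly what fails here). Names and values below are illustrative only; note that the percentage-sign column used by the failing test is xfailed on the sqlite fallback, so a plain column name is used instead.

import sqlite3
import pandas as pd

df = pd.DataFrame({"A": [0, 1, 2], "B": [3, 4, 5]})

# sqlite3 DBAPI connection: handled by the built-in fallback, no SQLAlchemy needed.
con = sqlite3.connect(":memory:")
rows = df.to_sql(name="demo_table", con=con, index=False)
print(rows)  # number of rows written (3 here)
con.close()

# A URI string or SQLAlchemy connectable takes the SQLAlchemy path instead; the
# engine below is a placeholder and would require a running PostgreSQL server.
# from sqlalchemy import create_engine
# engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
# df.to_sql(name="demo_table", con=engine, index=False)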
2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = True 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
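A minimal sketch of the raw_connection behaviour described in the docstring above: the returned object is a pool-proxied DBAPI connection, so close() checks it back into the pool rather than closing it. An in-memory SQLite engine is used here purely so the example does not depend on an external server.

from sqlalchemy import create_engine

engine = create_engine("sqlite://")
raw = engine.raw_connection()      # proxied DBAPI connection checked out from the pool
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchall())          # [(1,)]
    raw.commit()
finally:
    raw.close()                    # returns the connection to the pool, does not close it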
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
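A minimal sketch of the two equivalent call forms described in the psycopg2.connect docstring above; the credentials and database name are placeholders and both forms assume a PostgreSQL server actually listening on localhost:5432 (which is not the case in this test run).

import psycopg2

conn = psycopg2.connect(dbname="pandas", user="postgres",
                        password="postgres", host="localhost", port=5432)
# equivalent DSN-string form:
# conn = psycopg2.connect("host=localhost dbname=pandas user=postgres password=postgres port=5432")
conn.close()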
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_column_with_percentage[postgresql_psycopg2_conn] _____________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
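A minimal sketch of how keyword arguments are merged into the DSN, using psycopg2.extensions.make_dsn as in the connect() wrapper shown above; the host, port and credentials are placeholders, and the ordering of the resulting string may differ.

from psycopg2 import extensions as _ext

dsn = _ext.make_dsn("host=localhost port=5432", dbname="pandas", user="postgres")
print(dsn)  # e.g. "host=localhost port=5432 dbname=pandas user=postgres"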
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_column_with_percentage(conn, request): 2412s # GH 37157 2412s conn_name = conn 2412s if conn_name == "sqlite_buildin": 2412s request.applymarker(pytest.mark.xfail(reason="Not Implemented")) 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2588: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
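A minimal sketch of the dynamic-fixture pattern used by the failing test above: the fixture name arrives as a parametrize value and is resolved at run time with request.getfixturevalue, exactly the call that triggers the fixture setup in this traceback. The sqlite fixture below is a stand-in, not the pandas postgresql_psycopg2 fixture.

import sqlite3
import pytest

@pytest.fixture
def sqlite_memory_conn():
    conn = sqlite3.connect(":memory:")
    yield conn
    conn.close()

@pytest.mark.parametrize("conn", ["sqlite_memory_conn"])
def test_dynamic_fixture(conn, request):
    conn = request.getfixturevalue(conn)   # fixture resolved by name at run time
    assert conn.execute("SELECT 1").fetchone() == (1,)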
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
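A point worth noting while reading the frames above: SQLAlchemy engines are lazy, so the postgresql_psycopg2_engine fixture was created successfully with no server running; the raw_connection()/Pool.connect() call shown here is the first real network I/O, which is why the failure is reported during setup of the dependent connection fixture rather than when the engine was built. A minimal sketch (URL reconstructed from the DSN in the log) of where the error actually surfaces:

from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

# Succeeds even with no PostgreSQL server listening: no I/O happens yet.
engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

try:
    # First pool checkout; this is the step that can raise "Connection refused".
    engine.connect().close()
except OperationalError as err:
    print("failed at first checkout, not at create_engine():", err)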
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___________________ test_create_table[mysql_pymysql_engine] ____________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_create_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2676: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
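The lookup machinery above is being exercised because pandas' SQL tests parametrize over fixture names and resolve them lazily with request.getfixturevalue(conn), so a backend that cannot be set up only breaks the parametrizations that need it. A self-contained sketch of the same pattern, using an in-memory sqlite engine so it runs without any database server (the names here are illustrative, not pandas' own fixtures):

import pytest

@pytest.fixture
def sqlite_engine():
    sqlalchemy = pytest.importorskip("sqlalchemy")
    return sqlalchemy.create_engine("sqlite://")

@pytest.mark.parametrize("conn", ["sqlite_engine"])
def test_create_table(conn, request):
    # Resolve the fixture named by the parametrize value, as test_sql.py does.
    engine = request.getfixturevalue(conn)
    assert engine.dialect.name == "sqlite"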
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
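versioned_importorskip above delegates to pandas' import_optional_dependency, whose errors= modes are spelled out in the quoted docstring. A short usage sketch of how a caller typically distinguishes an absent optional dependency from an importable one (module path as shown in the traceback; this is a sketch, not part of the test suite):

from pandas.compat._optional import import_optional_dependency

# errors="raise" (the default used above) raises ImportError when the module
# is missing or too old, which versioned_importorskip turns into a test skip.
# errors="ignore" returns None instead, letting the caller branch quietly.
pymysql = import_optional_dependency("pymysql", errors="ignore")
if pymysql is None:
    print("pymysql not importable; MySQL-backed tests would be skipped")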
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7884d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7884d0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ____________________ test_create_table[mysql_pymysql_conn] _____________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_create_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2676: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
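The AttributeError above is the real failure for the MySQL cases: pymysql is installed, but importing it pulls in cryptography, whose pure-Python hashes module expects the compiled Rust binding to expose openssl.hashes, and on this testbed it does not. Because the import dies with AttributeError rather than ImportError, import_optional_dependency cannot report it as a missing dependency, so the test errors instead of skipping. A quick diagnostic sketch that probes the same attribute the traceback complains about:

from cryptography.hazmat.bindings import _rust

openssl = getattr(_rust, "openssl", None)
print("openssl binding present:", openssl is not None)
print("openssl.hashes present:", hasattr(openssl, "hashes"))  # False in this environment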
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7885f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea7885f0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ________________ test_create_table[postgresql_psycopg2_engine] _________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_create_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s conn = request.getfixturevalue(conn) 2412s 2412s from sqlalchemy import inspect 2412s 2412s temp_frame = DataFrame({"one": [1.0, 2.0, 3.0, 4.0], "two": [4.0, 3.0, 2.0, 1.0]}) 2412s > with sql.SQLDatabase(conn, need_transaction=True) as pandasSQL: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = True 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 
2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if 
should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _________________ test_create_table[postgresql_psycopg2_conn] __________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_create_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2676: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________________ test_drop_table[mysql_pymysql_engine] _____________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_drop_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2697: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea788d10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea788d10> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____________________ test_drop_table[mysql_pymysql_conn] ______________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_drop_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2697: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea788e30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea788e30> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________________ test_drop_table[postgresql_psycopg2_engine] __________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
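The AttributeError above is the root cause of the MySQL-related failures in this run: cryptography's pure-Python hashes module expects the compiled _rust extension to expose an openssl.hashes namespace, and on this testbed it does not, which typically indicates mismatched Python and compiled parts of the cryptography install (an inference from the log, not something it states; plausible on a partially upgraded or mixed-architecture image). A small diagnostic probe, assuming the same cryptography-internal import path the traceback shows:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography", cryptography.__version__)
if hasattr(rust_openssl, "hashes"):
    # Healthy install: the public hashes API is usable.
    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"probe")
    print("hashes OK:", digest.finalize().hex()[:16])
else:
    print("Rust bindings expose no 'hashes' attribute: "
          "Python and compiled layers of cryptography do not match")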
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
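The pool frames above show the general shape of the failure: a QueuePool only opens a real DBAPI connection, via its creator callable, when a connection is first checked out, so a failing creator surfaces at checkout time exactly as here. A minimal stand-alone sketch using sqlite3 in place of psycopg2:

import sqlite3

from sqlalchemy.pool import QueuePool


def creator():
    # Called by the pool to open a raw DBAPI connection.
    return sqlite3.connect(":memory:")


pool = QueuePool(creator, pool_size=1, max_overflow=0)
conn = pool.connect()   # first checkout triggers creator()
conn.close()            # returns the connection to the pool, does not close it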
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
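safe_reraise, used in the pool's __connect above, captures the in-flight exception, lets cleanup run, and then re-raises it with the original traceback attached. A generic sketch of that idiom (not SQLAlchemy's actual helper):

import sys


def failing_creator():
    raise RuntimeError("creator failed")


try:
    try:
        failing_creator()
    except BaseException:
        exc_type, exc_value, exc_tb = sys.exc_info()
        # ... cleanup would run here, e.g. discarding a half-built record ...
        raise exc_value.with_traceback(exc_tb)
except RuntimeError as err:
    print("re-raised with original traceback:", err)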
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_drop_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s conn = request.getfixturevalue(conn) 2412s 2412s from sqlalchemy import inspect 2412s 2412s temp_frame = DataFrame({"one": [1.0, 2.0, 3.0, 4.0], "two": [4.0, 3.0, 2.0, 1.0]}) 2412s > with sql.SQLDatabase(conn) as pandasSQL: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2702: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 
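The OperationalError above simply means nothing is listening on localhost:5432 inside the testbed, so every PostgreSQL-backed variant of these tests fails the same way. A minimal reproduction of the fixture's connection attempt, using the same parameters the log shows (host localhost, port 5432, dbname pandas, user/password postgres):

import psycopg2

try:
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
except psycopg2.OperationalError as exc:
    # With no PostgreSQL server running this prints the same
    # "Connection refused" message seen in the log.
    print("connection failed:", exc)
else:
    conn.close()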
2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if 
should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. 
seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 
2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 
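For context on the pandas side of these frames: SQLDatabase, quoted earlier in this traceback (pandas/io/sql.py:1636), registers every resource it creates on a contextlib.ExitStack so the engine and connection are cleaned up together. A small sketch of that pattern against an in-memory SQLite engine, so it runs without a database server:

from contextlib import ExitStack

from sqlalchemy import create_engine, text

with ExitStack() as stack:
    engine = create_engine("sqlite:///:memory:")   # stand-in URL
    stack.callback(engine.dispose)                 # dispose engine on exit
    conn = stack.enter_context(engine.connect())   # close connection on exit
    print(conn.execute(text("select 1")).scalar())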
2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s __________________ test_drop_table[postgresql_psycopg2_conn] ___________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 
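As the error above shows, SQLAlchemy wraps the driver-level psycopg2.OperationalError in its own sqlalchemy.exc.OperationalError (the sqlalche.me/e/20/e3q8 link) and keeps the original exception on .orig. A short sketch of catching it at the engine level, reusing the URL from the log; with no server listening this reproduces the wrapped error:

import sqlalchemy
from sqlalchemy import create_engine

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect():
        pass
except sqlalchemy.exc.OperationalError as exc:
    # exc.orig is the underlying psycopg2.OperationalError.
    print("wrapped driver error:", type(exc.orig).__name__)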
2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise 
exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2412s def test_drop_table(conn, request): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2697: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 
2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
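Fixture setup ultimately goes through pluggy's hook machinery (the _hookexec frames above), where pytest_fixture_setup is declared firstresult so the first non-None implementation wins. A minimal stand-alone pluggy example of that calling convention, with made-up hook and plugin names:

import pluggy

hookspec = pluggy.HookspecMarker("demo")
hookimpl = pluggy.HookimplMarker("demo")


class DemoSpec:
    @hookspec(firstresult=True)
    def demo_setup(self, name):
        """Return a value for *name*; the first non-None result is used."""


class DemoPlugin:
    @hookimpl
    def demo_setup(self, name):
        return f"setup:{name}"


pm = pluggy.PluginManager("demo")
pm.add_hookspecs(DemoSpec)
pm.register(DemoPlugin())
print(pm.hook.demo_setup(name="fixture"))   # -> 'setup:fixture'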
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
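Engine.connect() above fails while creating the underlying DBAPI connection, and _handle_dbapi_exception_noconnection rewraps the psycopg2 error as sqlalchemy.exc.OperationalError. A small sketch of the same call path with explicit handling; the URL mirrors the engine shown in this log and assumes the same postgres/postgres credentials:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        # exc.orig is the original psycopg2 error, e.g. "Connection refused".
        print(f"database not reachable: {exc.orig}")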
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
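The pool frames above (_ConnectionFairy._checkout down to _ConnectionRecord.__connect) show that a real DBAPI connection is only created at checkout time, which is why the failure surfaces inside the fixture rather than at create_engine(). Two documented engine options that change this behaviour, sketched with the same assumed URL:

    from sqlalchemy import create_engine
    from sqlalchemy.pool import NullPool

    url = "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"

    # pool_pre_ping validates each pooled connection with a lightweight
    # round trip before handing it out.
    engine = create_engine(url, pool_pre_ping=True)

    # NullPool opens and closes a fresh connection per checkout, so
    # short-lived test fixtures never hold sockets open between tests.
    engine_no_pool = create_engine(url, poolclass=NullPool)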
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
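psycopg2.connect above receives the DSN 'host=localhost dbname=pandas user=postgres password=postgres port=5432' and, as the OperationalError that follows shows, nothing is listening on port 5432 in this testbed. A direct reachability check outside pytest, assuming the same credentials:

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # Prints the same "Connection refused" message seen in this log.
        print(exc)
    else:
        conn.close()
        print("PostgreSQL is reachable")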
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _____________________ test_roundtrip[mysql_pymysql_engine] _____________________ 2412s conn = 'mysql_pymysql_engine' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_roundtrip(conn, request, test_frame1): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2724: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
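getfixturevalue and _get_active_fixturedef above resolve a fixture from its name at run time; pandas' SQL tests rely on this so one test body can be parametrized over many connection fixtures (all_connectable). A stripped-down sketch of the same pattern against an in-memory SQLite engine; the fixture and test names here are illustrative only:

    import pytest
    from sqlalchemy import create_engine, text

    @pytest.fixture
    def sqlite_engine():
        engine = create_engine("sqlite://")
        yield engine
        engine.dispose()

    @pytest.mark.parametrize("conn", ["sqlite_engine"])
    def test_select_one(conn, request):
        # The parametrized value is only a fixture *name*; resolve it
        # dynamically, exactly as test_roundtrip does above.
        engine = request.getfixturevalue(conn)
        with engine.connect() as connection:
            assert connection.execute(text("SELECT 1")).scalar() == 1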
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
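versioned_importorskip above is, as its docstring notes, a Debian-specific wrapper around pandas' import_optional_dependency: the optional backend is imported and version-checked, and the test is skipped if that fails. A sketch of probing an optional dependency the same way; note this imports a pandas-internal helper whose location may change between releases:

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore" returns None instead of raising when the module is
    # missing or too old, mirroring what the MySQL fixtures above attempt
    # before deciding whether to skip.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql unavailable; MySQL-backed tests would be skipped")
    else:
        print("pymysql", pymysql.__version__)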
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea789b50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea789b50> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
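The pymysql package source above defines install_as_MySQLdb(), which registers pymysql under the MySQLdb module name for mysqlclient/Django compatibility. A short usage sketch, assuming a pymysql installation that actually imports (which, as the traceback goes on to show, is not the case on this testbed):

    import sys
    import pymysql

    pymysql.install_as_MySQLdb()

    # Any subsequent "import MySQLdb" now resolves to pymysql itself.
    import MySQLdb
    assert MySQLdb is sys.modules["pymysql"]
    print(MySQLdb.get_client_info())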
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______________________ test_roundtrip[mysql_pymysql_conn] ______________________ 2412s conn = 'mysql_pymysql_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_roundtrip(conn, request, test_frame1): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2724: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 
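The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') means the compiled Rust extension does not match the installed python3-cryptography Python sources, so evaluating Hash = rust_openssl.hashes.Hash at import time fails and pymysql, which imports cryptography for its auth methods, becomes unimportable. A quick consistency check that exercises the same code path; on a healthy install it prints a digest, on a mismatched one the import itself raises the identical AttributeError:

    import cryptography
    from cryptography.hazmat.primitives import hashes

    # Importing hashes already evaluates Hash = rust_openssl.hashes.Hash
    # (hashes.py:87 above); constructing one exercises the binding as well.
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas autopkgtest")
    print(cryptography.__version__, digest.finalize().hex())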
2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 
2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea789c70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea789c70> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
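The mysqlclient-compatibility shim quoted above only aliases the module object in sys.modules. A hedged illustration of its effect on a machine where pymysql imports cleanly (it does not on this testbed, as the failure below shows):

    import pymysql

    pymysql.install_as_MySQLdb()  # registers pymysql under the name "MySQLdb"
    import MySQLdb                # now resolves to the pymysql module object

    assert MySQLdb is pymysql
    print(pymysql.get_client_info())  # "1.4.6", the mysqlclient-compatible version string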
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __________________ test_roundtrip[postgresql_psycopg2_engine] __________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
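The fixture error above is an AttributeError raised while cryptography binds its Rust openssl module (rust_openssl.hashes is missing), not an ImportError. pymysql._auth wraps its cryptography import in a try block that, upstream, appears to catch only ImportError, so the exception escapes, import pymysql itself fails, and versioned_importorskip propagates the error instead of skipping. A hedged sketch of why an ImportError guard does not help here:

    import importlib

    def import_or_none(name: str):
        """Tolerate a genuinely missing module, but let other import-time errors surface."""
        try:
            return importlib.import_module(name)
        except ImportError:
            return None

    # On this testbed "import pymysql" raises AttributeError via cryptography's
    # Rust bindings, so the guard above would not swallow it and the pandas
    # fixture setup errors out instead of skipping.
    mod = import_or_none("pymysql")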
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_roundtrip(conn, request, test_frame1): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s conn_name = conn 2412s conn = request.getfixturevalue(conn) 2412s > pandasSQL = pandasSQL_builder(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2725: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def pandasSQL_builder( 2412s con, 2412s schema: str | None = None, 2412s need_transaction: bool = False, 2412s ) -> PandasSQL: 2412s """ 2412s Convenience function to return the correct PandasSQL subclass based on the 2412s provided parameters. Also creates a sqlalchemy connection and transaction 2412s if necessary. 
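psycopg2.connect, whose docstring is quoted above, accepts either a DSN string or keyword arguments; the fixture builds the DSN host=localhost dbname=pandas user=postgres password=postgres port=5432 and fails because nothing listens on port 5432. A hedged sketch of the two equivalent call forms (both would raise the same OperationalError on this testbed):

    import psycopg2

    # DSN string form, exactly as shown in the traceback above.
    conn = psycopg2.connect(
        "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    )

    # Keyword form; psycopg2 merges these into the same DSN internally.
    conn = psycopg2.connect(
        dbname="pandas", user="postgres", password="postgres",
        host="localhost", port=5432,
    )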
2412s """ 2412s import sqlite3 2412s 2412s if isinstance(con, sqlite3.Connection) or con is None: 2412s return SQLiteDatabase(con) 2412s 2412s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2412s 2412s if isinstance(con, str) and sqlalchemy is None: 2412s raise ImportError("Using URI string without sqlalchemy installed.") 2412s 2412s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2412s > return SQLDatabase(con, schema, need_transaction) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s schema = None, need_transaction = False 2412s 2412s def __init__( 2412s self, con, schema: str | None = None, need_transaction: bool = False 2412s ) -> None: 2412s from sqlalchemy import create_engine 2412s from sqlalchemy.engine import Engine 2412s from sqlalchemy.schema import MetaData 2412s 2412s # self.exit_stack cleans up the Engine and Connection and commits the 2412s # transaction if any of those objects was created below. 2412s # Cleanup happens either in self.__exit__ or at the end of the iterator 2412s # returned by read_sql when chunksize is not None. 2412s self.exit_stack = ExitStack() 2412s if isinstance(con, str): 2412s con = create_engine(con) 2412s self.exit_stack.callback(con.dispose) 2412s if isinstance(con, Engine): 2412s > con = self.exit_stack.enter_context(con.connect()) 2412s 2412s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
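pandasSQL_builder, shown above, picks the wrapper from the connection type: a sqlite3.Connection (or None) gets SQLiteDatabase, while a SQLAlchemy connectable or URI string gets SQLDatabase. A hedged sketch of the same write-then-read roundtrip the failing test performs, using the sqlite3 path that needs no running server:

    import sqlite3

    import pandas as pd

    df = pd.DataFrame({"A": [1.0, 2.0], "B": ["x", "y"]})

    with sqlite3.connect(":memory:") as conn:
        df.to_sql("test_frame_roundtrip", conn, index=False)              # write ...
        result = pd.read_sql("SELECT * FROM test_frame_roundtrip", conn)  # ... and read back

    pd.testing.assert_frame_equal(result, df)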
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
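The frames above show SQLAlchemy catching the driver error inside Connection.__init__ and re-raising it wrapped via exc.DBAPIError.instance, which is why the same "Connection refused" text reappears below as sqlalchemy.exc.OperationalError. A hedged sketch of handling it at the wrapped level (assuming sqlalchemy and psycopg2 are importable):

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        # exc.orig is the original psycopg2.OperationalError seen in this log.
        print("wrapped driver error:", type(exc.orig).__name__)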
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ___________________ test_roundtrip[postgresql_psycopg2_conn] ___________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
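The "Connection refused" failures above come from the testbed having no PostgreSQL server on localhost:5432, so every postgresql-parametrized case in this module fails the same way. A hedged helper one could use to confirm that up front (hypothetical, not part of the suite):

    import socket

    def tcp_port_open(host: str = "localhost", port: int = 5432, timeout: float = 1.0) -> bool:
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("postgres reachable:", tcp_port_open())  # False on this testbed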
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn' 2412s request = > 2412s test_frame1 = index A B C D 2412s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2412s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2412s 2412s @pytest.mark.parametrize("conn", all_connectable) 2412s def test_roundtrip(conn, request, test_frame1): 2412s if conn == "sqlite_str": 2412s pytest.skip("sqlite_str has no inspection system") 2412s 2412s conn_name = conn 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2724: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 
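test_roundtrip is parametrized with fixture *names* and resolves them lazily through request.getfixturevalue, as documented above; that is why a broken database fixture surfaces as an error inside the test body rather than at collection time. A hedged, self-contained sketch of the same indirection (hypothetical fixture and test names):

    import pytest

    @pytest.fixture
    def sqlite_memory_uri():
        return "sqlite:///:memory:"

    # Parametrize over fixture *names*; each one is looked up only when the
    # test runs, mirroring `conn = request.getfixturevalue(conn)` above.
    @pytest.mark.parametrize("conn", ["sqlite_memory_uri"])
    def test_lazy_fixture_resolution(conn, request):
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite:///")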
2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. 
This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2412s > with postgresql_psycopg2_engine.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _________________ test_execute_sql[mysql_pymysql_engine_iris] __________________ 2412s conn = 'mysql_pymysql_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_execute_sql(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2742: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
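[Editor's sketch] The OperationalError above is the root cause of every postgresql_psycopg2_* failure in this run: psycopg2 cannot reach a PostgreSQL server on localhost:5432, so the engine fixture fails before any pandas code executes. Below is a minimal, illustrative reconstruction of the connection the fixture attempts, using the DSN values echoed in the traceback (user, password, database name come from the log; whether a server is actually listening is an assumption about the testbed, and this is not the pandas test code itself):

from sqlalchemy import create_engine, text
from sqlalchemy.exc import OperationalError

# Same URL as the Engine repr in the traceback:
#   postgresql+psycopg2://postgres:***@localhost:5432/pandas
engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
except OperationalError as err:
    # With no PostgreSQL service running in the testbed this raises
    # "connection to server ... port 5432 failed: Connection refused",
    # matching the error reported above.
    print(err)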
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78a1b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78a1b0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s __________________ test_execute_sql[mysql_pymysql_conn_iris] ___________________ 2412s conn = 'mysql_pymysql_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_execute_sql(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2742: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
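[Editor's sketch] The AttributeError above, not a missing MySQL server, is what breaks the pymysql-based fixtures: importing pymysql pulls in pymysql._auth, which imports cryptography, and this cryptography installation exposes a _rust.openssl binding without a 'hashes' attribute, so cryptography/hazmat/primitives/hashes.py fails at import time. The probe below is illustrative only (module paths are taken from the traceback; it is not part of the test suite) and shows how the mismatch can be checked directly:

# Mirrors the import that succeeds in the traceback, then probes the attribute
# whose absence raises the AttributeError above.
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

if hasattr(rust_openssl, "hashes"):
    import pymysql  # imports cleanly when the binding exposes 'hashes'
    print(pymysql.VERSION_STRING)
else:
    # The state seen in the log: the Python sources and the compiled _rust
    # extension appear to come from different cryptography builds, so hashes.py
    # (and therefore pymysql._auth) cannot finish importing.
    print("cryptography _rust bindings out of sync; importing pymysql will fail")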
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 
2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78a2d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78a2d0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s ______________ test_execute_sql[postgresql_psycopg2_engine_iris] _______________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
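[note] The AttributeError that ends the mysql_pymysql_engine fixture traceback above comes from the import chain pymysql -> pymysql._auth -> cryptography.hazmat.primitives.hashes, where the module-level line "Hash = rust_openssl.hashes.Hash" fails because the installed cryptography Rust extension does not expose a "hashes" attribute (typically a mismatch between the compiled _rust binding and the Python sources on this i386 testbed). A minimal, hedged reproduction sketch outside pytest — module paths are taken from the traceback, the probe itself is illustrative and not part of the pandas test suite:

    # Probe whether the compiled cryptography Rust binding matches the Python
    # sources, then attempt the same pymysql import that the fixture performs.
    import importlib

    try:
        rust = importlib.import_module("cryptography.hazmat.bindings._rust")
        print("openssl.hashes present:", hasattr(rust.openssl, "hashes"))
        importlib.import_module("pymysql")
        print("pymysql imported OK")
    except Exception as exc:
        # On this testbed the hasattr() check is False and the pymysql import
        # fails with the same AttributeError shown in the log above.
        print(f"import chain failed: {type(exc).__name__}: {exc}")

[/note]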
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. 
The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_execute_sql(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2742: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 
2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s _______________ test_execute_sql[postgresql_psycopg2_conn_iris] ________________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s 2412s The above exception was the direct cause of the following exception: 2412s 2412s conn = 'postgresql_psycopg2_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", all_connectable_iris) 2412s def test_execute_sql(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2742: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
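The pytest comments quoted above explain that an overriding fixture may request its own name and then receives the value of the fixture it overrides, one level up. A small self-contained sketch of that behaviour, with hypothetical fixture names not taken from the pandas suite:

    import pytest

    @pytest.fixture
    def db_name():
        return "pandas"

    class TestOverride:
        # A class-level fixture with the same name overrides the module-level one;
        # requesting "db_name" here resolves one level up, to the fixture above.
        @pytest.fixture
        def db_name(self, db_name):
            return db_name + "_test"

        def test_db_name(self, db_name):
            assert db_name == "pandas_test"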
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'postgresql_psycopg2_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
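pluggy dispatches pytest_fixture_setup through every registered implementation, including the wrapper-style hook from _pytest/setuponly.py that appears a little further down in this traceback. A minimal sketch of such a wrapper hook, as it might appear in a hypothetical conftest.py; it only passes the fixture result through:

    # conftest.py (hypothetical)
    import pytest

    @pytest.hookimpl(wrapper=True)
    def pytest_fixture_setup(fixturedef, request):
        # Code before the yield runs before the inner hook implementations;
        # the yield hands back the fixture value (or re-raises their exception).
        result = yield
        # Code after the yield could inspect the value; it is returned unchanged here.
        return result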
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s @pytest.fixture 2412s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2412s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2412s 2412s def create_and_load_iris(conn, iris_file: Path): 2412s from sqlalchemy import insert 2412s 2412s iris = iris_table_metadata() 2412s 2412s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2412s reader = csv.reader(csvfile) 2412s header = next(reader) 2412s params = [dict(zip(header, row)) for row in reader] 2412s stmt = insert(iris).values(params) 2412s > with conn.begin() as con: 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __enter__(self): 2412s # do 
not keep args and kwds alive unnecessarily 2412s # they are only needed for recreation, which is not possible anymore 2412s del self.args, self.kwds, self.func 2412s try: 2412s > return next(self.gen) 2412s 2412s /usr/lib/python3.13/contextlib.py:141: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s @contextlib.contextmanager 2412s def begin(self) -> Iterator[Connection]: 2412s """Return a context manager delivering a :class:`_engine.Connection` 2412s with a :class:`.Transaction` established. 2412s 2412s E.g.:: 2412s 2412s with engine.begin() as conn: 2412s conn.execute( 2412s text("insert into table (x, y, z) values (1, 2, 3)") 2412s ) 2412s conn.execute(text("my_special_procedure(5)")) 2412s 2412s Upon successful operation, the :class:`.Transaction` 2412s is committed. If an error is raised, the :class:`.Transaction` 2412s is rolled back. 2412s 2412s .. seealso:: 2412s 2412s :meth:`_engine.Engine.connect` - procure a 2412s :class:`_engine.Connection` from 2412s an :class:`_engine.Engine`. 2412s 2412s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2412s for a particular :class:`_engine.Connection`. 2412s 2412s """ 2412s > with self.connect() as conn: 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def connect(self) -> Connection: 2412s """Return a new :class:`_engine.Connection` object. 2412s 2412s The :class:`_engine.Connection` acts as a Python context manager, so 2412s the typical use of this method looks like:: 2412s 2412s with engine.connect() as connection: 2412s connection.execute(text("insert into table values ('foo')")) 2412s connection.commit() 2412s 2412s Where above, after the block is completed, the connection is "closed" 2412s and its underlying DBAPI resources are returned to the connection pool. 2412s This also has the effect of rolling back any transaction that 2412s was explicitly begun or was begun via autobegin, and will 2412s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2412s started and is still in progress. 2412s 2412s .. 
seealso:: 2412s 2412s :meth:`_engine.Engine.begin` 2412s 2412s """ 2412s 2412s > return self._connection_cls(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s self._dbapi_connection = engine.raw_connection() 2412s except dialect.loaded_dbapi.Error as err: 2412s > Connection._handle_dbapi_exception_noconnection( 2412s err, dialect, engine 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2412s dialect = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2412s 2412s @classmethod 2412s def _handle_dbapi_exception_noconnection( 2412s cls, 2412s e: BaseException, 2412s dialect: Dialect, 2412s engine: Optional[Engine] = None, 2412s is_disconnect: Optional[bool] = None, 2412s invalidate_pool_on_disconnect: bool = True, 2412s is_pre_ping: bool = False, 2412s ) -> NoReturn: 2412s exc_info = sys.exc_info() 2412s 2412s if is_disconnect is None: 2412s is_disconnect = isinstance( 2412s e, dialect.loaded_dbapi.Error 2412s ) and dialect.is_disconnect(e, None, None) 2412s 2412s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2412s 2412s if should_wrap: 2412s sqlalchemy_exception = exc.DBAPIError.instance( 2412s None, 2412s None, 2412s cast(Exception, e), 2412s dialect.loaded_dbapi.Error, 2412s hide_parameters=( 2412s engine.hide_parameters if engine is not None else False 2412s ), 2412s connection_invalidated=is_disconnect, 2412s dialect=dialect, 2412s ) 2412s else: 2412s sqlalchemy_exception = None 2412s 2412s newraise = None 2412s 2412s if dialect._has_events: 2412s ctx = ExceptionContextImpl( 2412s e, 2412s sqlalchemy_exception, 2412s engine, 2412s dialect, 2412s None, 2412s None, 2412s None, 2412s None, 2412s None, 2412s is_disconnect, 2412s invalidate_pool_on_disconnect, 2412s is_pre_ping, 2412s ) 2412s for fn in dialect.dispatch.handle_error: 2412s try: 2412s # handler returns an exception; 2412s # call next handler in a chain 2412s per_fn = fn(ctx) 2412s if per_fn is not None: 2412s ctx.chained_exception = newraise = per_fn 2412s except Exception as _raised: 2412s # handler raises an exception - stop processing 2412s newraise = _raised 2412s break 2412s 2412s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2412s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2412s ctx.is_disconnect 2412s ) 2412s 2412s if newraise: 2412s raise 
newraise.with_traceback(exc_info[2]) from e 2412s elif should_wrap: 2412s assert sqlalchemy_exception is not None 2412s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 
2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2412s if dialect._has_events: 2412s for fn in dialect.dispatch.do_connect: 2412s connection = cast( 2412s DBAPIConnection, 2412s fn(dialect, connection_record, cargs, cparams), 2412s ) 2412s if connection is not None: 2412s return connection 2412s 2412s > return dialect.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s cargs = () 2412s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s 2412s def connect(self, *cargs, **cparams): 2412s # inherits the docstring from interfaces.Dialect.connect 2412s > return self.loaded_dbapi.connect(*cargs, **cparams) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2412s connection_factory = None, cursor_factory = None 2412s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2412s kwasync = {} 2412s 2412s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2412s """ 2412s Create a new database connection. 2412s 2412s The connection parameters can be specified as a string: 2412s 2412s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2412s 2412s or using a set of keyword arguments: 2412s 2412s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2412s 2412s Or as a mix of both. The basic connection parameters are: 2412s 2412s - *dbname*: the database name 2412s - *database*: the database name (only as keyword argument) 2412s - *user*: user name used to authenticate 2412s - *password*: password used to authenticate 2412s - *host*: database host address (defaults to UNIX socket if not provided) 2412s - *port*: connection port number (defaults to 5432 if not provided) 2412s 2412s Using the *connection_factory* parameter a different class or connections 2412s factory can be specified. It should be a callable object taking a dsn 2412s argument. 2412s 2412s Using the *cursor_factory* parameter, a new default cursor factory will be 2412s used by cursor(). 2412s 2412s Using *async*=True an asynchronous connection will be created. *async_* is 2412s a valid alias (for Python versions where ``async`` is a keyword). 2412s 2412s Any other keyword parameter will be passed to the underlying client 2412s library: the list of supported parameters depends on the library version. 
2412s 2412s """ 2412s kwasync = {} 2412s if 'async' in kwargs: 2412s kwasync['async'] = kwargs.pop('async') 2412s if 'async_' in kwargs: 2412s kwasync['async_'] = kwargs.pop('async_') 2412s 2412s dsn = _ext.make_dsn(dsn, **kwargs) 2412s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2412s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2412s E Is the server running on that host and accepting TCP/IP connections? 2412s E 2412s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2412s 2412s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2412s ____________ test_sqlalchemy_read_table[mysql_pymysql_engine_iris] _____________ 2412s conn = 'mysql_pymysql_engine_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_sqlalchemy_read_table(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2753: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 
2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 
2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
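Both failing tests resolve their connection fixture dynamically, exactly as the pandas test code quoted above does with conn = request.getfixturevalue(conn) under @pytest.mark.parametrize. A minimal self-contained sketch of that pattern, with hypothetical fixture names:

    import pytest

    @pytest.fixture
    def sqlite_conn():
        return "sqlite:///:memory:"

    @pytest.fixture
    def postgres_conn():
        return "postgresql+psycopg2://postgres:***@localhost:5432/pandas"

    # The parametrized value is a fixture *name*; the test looks it up at runtime,
    # which is why a broken database setup surfaces as a fixture setup error here.
    @pytest.mark.parametrize("conn", ["sqlite_conn", "postgres_conn"])
    def test_connectable(conn, request):
        conn = request.getfixturevalue(conn)
        assert "://" in conn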
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. 
(#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
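The import_optional_dependency docstring quoted above describes the raise/warn/ignore behaviour pandas uses for optional imports, and versioned_importorskip wraps it so a missing or too-old module skips the test instead. A reduced sketch of the core pattern (hypothetical helper name; only the errors="raise" path, with the version check omitted):

    import importlib

    def import_optional(name: str, extra: str = "") -> object:
        # Mirrors the message format quoted above; raises when the module is absent.
        msg = (
            f"Missing optional dependency '{name}'. {extra} "
            f"Use pip or conda to install {name}."
        )
        try:
            return importlib.import_module(name)
        except ImportError as err:
            raise ImportError(msg) from err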
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78acf0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78acf0> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _____________ test_sqlalchemy_read_table[mysql_pymysql_conn_iris] ______________ 2412s conn = 'mysql_pymysql_conn_iris' 2412s request = > 2412s 2412s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2412s def test_sqlalchemy_read_table(conn, request): 2412s > conn = request.getfixturevalue(conn) 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2753: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def getfixturevalue(self, argname: str) -> Any: 2412s """Dynamically run a named fixture function. 2412s 2412s Declaring fixtures via function argument is recommended where possible. 2412s But if you can only decide whether to use another fixture at test 2412s setup time, you may use this function to retrieve it inside a fixture 2412s or test function body. 2412s 2412s This method can be used during the test setup phase or the test run 2412s phase, but during the test teardown phase a fixture's value may not 2412s be available. 2412s 2412s :param argname: 2412s The fixture name. 2412s :raises pytest.FixtureLookupError: 2412s If the given fixture could not be found. 
2412s """ 2412s # Note that in addition to the use case described in the docstring, 2412s # getfixturevalue() is also called by pytest itself during item and fixture 2412s # setup to evaluate the fixtures that are requested statically 2412s # (using function parameters, autouse, etc). 2412s 2412s > fixturedef = self._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_conn_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 
2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine_iris' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 
2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s > fixturedef = request._get_active_fixturedef(argname) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = > 2412s argname = 'mysql_pymysql_engine' 2412s 2412s def _get_active_fixturedef( 2412s self, argname: str 2412s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2412s if argname == "request": 2412s cached_result = (self, [0], None) 2412s return PseudoFixtureDef(cached_result, Scope.Function) 2412s 2412s # If we already finished computing a fixture by this name in this item, 2412s # return it. 2412s fixturedef = self._fixture_defs.get(argname) 2412s if fixturedef is not None: 2412s self._check_scope(fixturedef, fixturedef._scope) 2412s return fixturedef 2412s 2412s # Find the appropriate fixturedef. 2412s fixturedefs = self._arg2fixturedefs.get(argname, None) 2412s if fixturedefs is None: 2412s # We arrive here because of a dynamic call to 2412s # getfixturevalue(argname) which was naturally 2412s # not known at parsing/collection time. 2412s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2412s if fixturedefs is not None: 2412s self._arg2fixturedefs[argname] = fixturedefs 2412s # No fixtures defined with this name. 2412s if fixturedefs is None: 2412s raise FixtureLookupError(argname, self) 2412s # The are no fixtures with this name applicable for the function. 
2412s if not fixturedefs: 2412s raise FixtureLookupError(argname, self) 2412s # A fixture may override another fixture with the same name, e.g. a 2412s # fixture in a module can override a fixture in a conftest, a fixture in 2412s # a class can override a fixture in the module, and so on. 2412s # An overriding fixture can request its own name (possibly indirectly); 2412s # in this case it gets the value of the fixture it overrides, one level 2412s # up. 2412s # Check how many `argname`s deep we are, and take the next one. 2412s # `fixturedefs` is sorted from furthest to closest, so use negative 2412s # indexing to go in reverse. 2412s index = -1 2412s for request in self._iter_chain(): 2412s if request.fixturename == argname: 2412s index -= 1 2412s # If already consumed all of the available levels, fail. 2412s if -index > len(fixturedefs): 2412s raise FixtureLookupError(argname, self) 2412s fixturedef = fixturedefs[index] 2412s 2412s # Prepare a SubRequest object for calling the fixture. 2412s try: 2412s callspec = self._pyfuncitem.callspec 2412s except AttributeError: 2412s callspec = None 2412s if callspec is not None and argname in callspec.params: 2412s param = callspec.params[argname] 2412s param_index = callspec.indices[argname] 2412s # The parametrize invocation scope overrides the fixture's scope. 2412s scope = callspec._arg2scope[argname] 2412s else: 2412s param = NOTSET 2412s param_index = 0 2412s scope = fixturedef._scope 2412s self._check_fixturedef_without_param(fixturedef) 2412s self._check_scope(fixturedef, scope) 2412s subrequest = SubRequest( 2412s self, scope, param, param_index, fixturedef, _ispytest=True 2412s ) 2412s 2412s # Make sure the fixture value is cached, running it if it isn't 2412s > fixturedef.execute(request=subrequest) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s request = > 2412s 2412s def execute(self, request: SubRequest) -> FixtureValue: 2412s """Return the value of this fixture, executing it if not cached.""" 2412s # Ensure that the dependent fixtures requested by this fixture are loaded. 2412s # This needs to be done before checking if we have a cached value, since 2412s # if a dependent fixture has their cache invalidated, e.g. due to 2412s # parametrization, they finalize themselves and fixtures depending on it 2412s # (which will likely include this fixture) setting `self.cached_result = None`. 2412s # See #4871 2412s requested_fixtures_that_should_finalize_us = [] 2412s for argname in self.argnames: 2412s fixturedef = request._get_active_fixturedef(argname) 2412s # Saves requested fixtures in a list so we later can add our finalizer 2412s # to them, ensuring that if a requested fixture gets torn down we get torn 2412s # down first. This is generally handled by SetupState, but still currently 2412s # needed when this fixture is not parametrized but depends on a parametrized 2412s # fixture. 2412s if not isinstance(fixturedef, PseudoFixtureDef): 2412s requested_fixtures_that_should_finalize_us.append(fixturedef) 2412s 2412s # Check for (and return) cached value/exception. 2412s if self.cached_result is not None: 2412s request_cache_key = self.cache_key(request) 2412s cache_key = self.cached_result[1] 2412s try: 2412s # Attempt to make a normal == check: this might fail for objects 2412s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2412s cache_hit = bool(request_cache_key == cache_key) 2412s except (ValueError, RuntimeError): 2412s # If the comparison raises, use 'is' as fallback. 2412s cache_hit = request_cache_key is cache_key 2412s 2412s if cache_hit: 2412s if self.cached_result[2] is not None: 2412s exc, exc_tb = self.cached_result[2] 2412s raise exc.with_traceback(exc_tb) 2412s else: 2412s result = self.cached_result[0] 2412s return result 2412s # We have a previous but differently parametrized fixture instance 2412s # so we need to tear it down before creating a new one. 2412s self.finish(request) 2412s assert self.cached_result is None 2412s 2412s # Add finalizer to requested fixtures we saved previously. 2412s # We make sure to do this after checking for cached value to avoid 2412s # adding our finalizer multiple times. (#12135) 2412s finalizer = functools.partial(self.finish, request=request) 2412s for parent_fixture in requested_fixtures_that_should_finalize_us: 2412s parent_fixture.addfinalizer(finalizer) 2412s 2412s ihook = request.node.ihook 2412s try: 2412s # Setup the fixture, run the code in it, and cache the value 2412s # in self.cached_result 2412s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def __call__(self, **kwargs: object) -> Any: 2412s """Call the hook. 2412s 2412s Only accepts keyword arguments, which should match the hook 2412s specification. 2412s 2412s Returns the result(s) of calling all registered plugins, see 2412s :ref:`calling`. 2412s """ 2412s assert ( 2412s not self.is_historic() 2412s ), "Cannot directly call a historic hook - use call_historic instead." 2412s self._verify_all_args_are_provided(kwargs) 2412s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2412s # Copy because plugins may register other plugins during iteration (#438). 2412s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2412s hook_name = 'pytest_fixture_setup' 2412s methods = [>] 2412s kwargs = {'fixturedef': , 'request': >} 2412s firstresult = True 2412s 2412s def _hookexec( 2412s self, 2412s hook_name: str, 2412s methods: Sequence[HookImpl], 2412s kwargs: Mapping[str, object], 2412s firstresult: bool, 2412s ) -> object | list[object]: 2412s # called from all hookcaller instances. 
2412s # enable_tracing will set its own wrapping function at self._inner_hookexec 2412s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2412s 2412s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s @pytest.hookimpl(wrapper=True) 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[object], request: SubRequest 2412s ) -> Generator[None, object, object]: 2412s try: 2412s > return (yield) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturedef = 2412s request = > 2412s 2412s def pytest_fixture_setup( 2412s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2412s ) -> FixtureValue: 2412s """Execution of fixture setup.""" 2412s kwargs = {} 2412s for argname in fixturedef.argnames: 2412s kwargs[argname] = request.getfixturevalue(argname) 2412s 2412s fixturefunc = resolve_fixture_function(fixturedef, request) 2412s my_cache_key = fixturedef.cache_key(request) 2412s try: 2412s > result = call_fixture_func(fixturefunc, request, kwargs) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s fixturefunc = 2412s request = > 2412s kwargs = {} 2412s 2412s def call_fixture_func( 2412s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2412s ) -> FixtureValue: 2412s if is_generator(fixturefunc): 2412s fixturefunc = cast( 2412s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2412s ) 2412s generator = fixturefunc(**kwargs) 2412s try: 2412s > fixture_result = next(generator) 2412s 2412s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s @pytest.fixture 2412s def mysql_pymysql_engine(): 2412s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2412s > pymysql = td.versioned_importorskip("pymysql") 2412s 2412s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s args = ('pymysql',), kwargs = {} 2412s 2412s def versioned_importorskip(*args, **kwargs): 2412s """ 2412s (warning - this is currently Debian-specific, the name may change if upstream request this) 2412s 2412s Return the requested module, or skip the test if it is 2412s not available in a new enough version. 2412s 2412s Intended as a replacement for pytest.importorskip that 2412s defaults to requiring at least pandas' minimum version for that 2412s optional dependency, rather than any version. 2412s 2412s See import_optional_dependency for full parameter documentation. 2412s """ 2412s try: 2412s > module = import_optional_dependency(*args, **kwargs) 2412s 2412s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2412s 2412s def import_optional_dependency( 2412s name: str, 2412s extra: str = "", 2412s errors: str = "raise", 2412s min_version: str | None = None, 2412s ): 2412s """ 2412s Import an optional dependency. 
2412s 2412s By default, if a dependency is missing an ImportError with a nice 2412s message will be raised. If a dependency is present, but too old, 2412s we raise. 2412s 2412s Parameters 2412s ---------- 2412s name : str 2412s The module name. 2412s extra : str 2412s Additional text to include in the ImportError message. 2412s errors : str {'raise', 'warn', 'ignore'} 2412s What to do when a dependency is not found or its version is too old. 2412s 2412s * raise : Raise an ImportError 2412s * warn : Only applicable when a module's version is to old. 2412s Warns that the version is too old and returns None 2412s * ignore: If the module is not installed, return None, otherwise, 2412s return the module, even if the version is too old. 2412s It's expected that users validate the version locally when 2412s using ``errors="ignore"`` (see. ``io/html.py``) 2412s min_version : str, default None 2412s Specify a minimum version that is different from the global pandas 2412s minimum version required. 2412s Returns 2412s ------- 2412s maybe_module : Optional[ModuleType] 2412s The imported module, when found and the version is correct. 2412s None is returned when the package is not found and `errors` 2412s is False, or when the package's version is too old and `errors` 2412s is ``'warn'`` or ``'ignore'``. 2412s """ 2412s assert errors in {"warn", "raise", "ignore"} 2412s if name=='numba' and warn_numba_platform: 2412s warnings.warn(warn_numba_platform) 2412s 2412s package_name = INSTALL_MAPPING.get(name) 2412s install_name = package_name if package_name is not None else name 2412s 2412s msg = ( 2412s f"Missing optional dependency '{install_name}'. {extra} " 2412s f"Use pip or conda to install {install_name}." 2412s ) 2412s try: 2412s > module = importlib.import_module(name) 2412s 2412s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None 2412s 2412s def import_module(name, package=None): 2412s """Import a module. 2412s 2412s The 'package' argument is required when performing a relative import. It 2412s specifies the package to use as the anchor point from which to resolve the 2412s relative import to an absolute import. 2412s 2412s """ 2412s level = 0 2412s if name.startswith('.'): 2412s if not package: 2412s raise TypeError("the 'package' argument is required to perform a " 2412s f"relative import for {name!r}") 2412s for character in name: 2412s if character != '.': 2412s break 2412s level += 1 2412s > return _bootstrap._gcd_import(name[level:], package, level) 2412s 2412s /usr/lib/python3.13/importlib/__init__.py:88: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', package = None, level = 0 2412s 2412s > ??? 2412s 2412s :1387: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 2412s 2412s :1360: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s name = 'pymysql', import_ = 2412s 2412s > ??? 
2412s 2412s :1331: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78ae10>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2412s 2412s > ??? 2412s 2412s :935: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78ae10> 2412s module = 2412s 2412s > ??? 2412s 2412s :1022: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s f = 2412s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2412s kwds = {} 2412s 2412s > ??? 2412s 2412s :488: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s PyMySQL: A pure-Python MySQL client library. 2412s 2412s Copyright (c) 2010-2016 PyMySQL contributors 2412s 2412s Permission is hereby granted, free of charge, to any person obtaining a copy 2412s of this software and associated documentation files (the "Software"), to deal 2412s in the Software without restriction, including without limitation the rights 2412s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2412s copies of the Software, and to permit persons to whom the Software is 2412s furnished to do so, subject to the following conditions: 2412s 2412s The above copyright notice and this permission notice shall be included in 2412s all copies or substantial portions of the Software. 2412s 2412s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2412s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2412s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2412s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2412s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2412s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2412s THE SOFTWARE. 2412s """ 2412s 2412s import sys 2412s 2412s from .constants import FIELD_TYPE 2412s from .err import ( 2412s Warning, 2412s Error, 2412s InterfaceError, 2412s DataError, 2412s DatabaseError, 2412s OperationalError, 2412s IntegrityError, 2412s InternalError, 2412s NotSupportedError, 2412s ProgrammingError, 2412s MySQLError, 2412s ) 2412s from .times import ( 2412s Date, 2412s Time, 2412s Timestamp, 2412s DateFromTicks, 2412s TimeFromTicks, 2412s TimestampFromTicks, 2412s ) 2412s 2412s # PyMySQL version. 2412s # Used by setuptools and connection_attrs 2412s VERSION = (1, 1, 1, "final", 1) 2412s VERSION_STRING = "1.1.1" 2412s 2412s ### for mysqlclient compatibility 2412s ### Django checks mysqlclient version. 2412s version_info = (1, 4, 6, "final", 1) 2412s __version__ = "1.4.6" 2412s 2412s 2412s def get_client_info(): # for MySQLdb compatibility 2412s return __version__ 2412s 2412s 2412s def install_as_MySQLdb(): 2412s """ 2412s After this function is called, any application that imports MySQLdb 2412s will unwittingly actually use pymysql. 
2412s """ 2412s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2412s 2412s 2412s # end of mysqlclient compatibility code 2412s 2412s threadsafety = 1 2412s apilevel = "2.0" 2412s paramstyle = "pyformat" 2412s 2412s > from . import connections # noqa: E402 2412s 2412s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # Python implementation of the MySQL client-server protocol 2412s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2412s # Error codes: 2412s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2412s import errno 2412s import os 2412s import socket 2412s import struct 2412s import sys 2412s import traceback 2412s import warnings 2412s 2412s > from . import _auth 2412s 2412s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s """ 2412s Implements auth methods 2412s """ 2412s 2412s from .err import OperationalError 2412s 2412s 2412s try: 2412s from cryptography.hazmat.backends import default_backend 2412s > from cryptography.hazmat.primitives import serialization, hashes 2412s 2412s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s > from cryptography.hazmat.primitives._serialization import ( 2412s BestAvailableEncryption, 2412s Encoding, 2412s KeySerializationEncryption, 2412s NoEncryption, 2412s ParameterFormat, 2412s PrivateFormat, 2412s PublicFormat, 2412s _KeySerializationEncryption, 2412s ) 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography import utils 2412s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s # This file is dual licensed under the terms of the Apache License, Version 2412s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2412s # for complete details. 
2412s 2412s from __future__ import annotations 2412s 2412s import abc 2412s 2412s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2412s 2412s __all__ = [ 2412s "HashAlgorithm", 2412s "HashContext", 2412s "Hash", 2412s "ExtendableOutputFunction", 2412s "SHA1", 2412s "SHA512_224", 2412s "SHA512_256", 2412s "SHA224", 2412s "SHA256", 2412s "SHA384", 2412s "SHA512", 2412s "SHA3_224", 2412s "SHA3_256", 2412s "SHA3_384", 2412s "SHA3_512", 2412s "SHAKE128", 2412s "SHAKE256", 2412s "MD5", 2412s "BLAKE2b", 2412s "BLAKE2s", 2412s "SM3", 2412s ] 2412s 2412s 2412s class HashAlgorithm(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def name(self) -> str: 2412s """ 2412s A string naming this algorithm (e.g. "sha256", "md5"). 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def digest_size(self) -> int: 2412s """ 2412s The size of the resulting digest in bytes. 2412s """ 2412s 2412s @property 2412s @abc.abstractmethod 2412s def block_size(self) -> int | None: 2412s """ 2412s The internal block size of the hash function, or None if the hash 2412s function does not use blocks internally (e.g. SHA3). 2412s """ 2412s 2412s 2412s class HashContext(metaclass=abc.ABCMeta): 2412s @property 2412s @abc.abstractmethod 2412s def algorithm(self) -> HashAlgorithm: 2412s """ 2412s A HashAlgorithm that will be used by this context. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def update(self, data: bytes) -> None: 2412s """ 2412s Processes the provided bytes through the hash. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def finalize(self) -> bytes: 2412s """ 2412s Finalizes the hash context and returns the hash digest as bytes. 2412s """ 2412s 2412s @abc.abstractmethod 2412s def copy(self) -> HashContext: 2412s """ 2412s Return a HashContext that is a copy of the current context. 2412s """ 2412s 2412s 2412s > Hash = rust_openssl.hashes.Hash 2412s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2412s 2412s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2412s _________ test_sqlalchemy_read_table[postgresql_psycopg2_engine_iris] __________ 2412s self = 2412s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s connection = None, _has_events = None, _allow_revalidate = True 2412s _allow_autobegin = True 2412s 2412s def __init__( 2412s self, 2412s engine: Engine, 2412s connection: Optional[PoolProxiedConnection] = None, 2412s _has_events: Optional[bool] = None, 2412s _allow_revalidate: bool = True, 2412s _allow_autobegin: bool = True, 2412s ): 2412s """Construct a new Connection.""" 2412s self.engine = engine 2412s self.dialect = dialect = engine.dialect 2412s 2412s if connection is None: 2412s try: 2412s > self._dbapi_connection = engine.raw_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2412s 2412s def raw_connection(self) -> PoolProxiedConnection: 2412s """Return a "raw" DBAPI connection from the connection pool. 2412s 2412s The returned object is a proxied version of the DBAPI 2412s connection object used by the underlying driver in use. 
2412s The object will have all the same behavior as the real DBAPI 2412s connection, except that its ``close()`` method will result in the 2412s connection being returned to the pool, rather than being closed 2412s for real. 2412s 2412s This method provides direct DBAPI connection access for 2412s special situations when the API provided by 2412s :class:`_engine.Connection` 2412s is not needed. When a :class:`_engine.Connection` object is already 2412s present, the DBAPI connection is available using 2412s the :attr:`_engine.Connection.connection` accessor. 2412s 2412s .. seealso:: 2412s 2412s :ref:`dbapi_connections` 2412s 2412s """ 2412s > return self.pool.connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def connect(self) -> PoolProxiedConnection: 2412s """Return a DBAPI connection from the pool. 2412s 2412s The connection is instrumented such that when its 2412s ``close()`` method is called, the connection will be returned to 2412s the pool. 2412s 2412s """ 2412s > return _ConnectionFairy._checkout(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s threadconns = None, fairy = None 2412s 2412s @classmethod 2412s def _checkout( 2412s cls, 2412s pool: Pool, 2412s threadconns: Optional[threading.local] = None, 2412s fairy: Optional[_ConnectionFairy] = None, 2412s ) -> _ConnectionFairy: 2412s if not fairy: 2412s > fairy = _ConnectionRecord.checkout(pool) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s cls = 2412s pool = 2412s 2412s @classmethod 2412s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2412s if TYPE_CHECKING: 2412s rec = cast(_ConnectionRecord, pool._do_get()) 2412s else: 2412s > rec = pool._do_get() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _do_get(self) -> ConnectionPoolEntry: 2412s > return self._create_connection() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def _create_connection(self) -> ConnectionPoolEntry: 2412s """Called by subclasses to create a new ConnectionRecord.""" 2412s 2412s > return _ConnectionRecord(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s pool = , connect = True 2412s 2412s def __init__(self, pool: Pool, connect: bool = True): 2412s self.fresh = False 2412s self.fairy_ref = None 2412s self.starttime = 0 2412s self.dbapi_connection = None 2412s 2412s self.__pool = pool 2412s if connect: 2412s > self.__connect() 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s 
try: 2412s self.starttime = time.time() 2412s self.dbapi_connection = connection = pool._invoke_creator(self) 2412s pool.logger.debug("Created new connection %r", connection) 2412s self.fresh = True 2412s except BaseException as e: 2412s > with util.safe_reraise(): 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s type_ = None, value = None, traceback = None 2412s 2412s def __exit__( 2412s self, 2412s type_: Optional[Type[BaseException]], 2412s value: Optional[BaseException], 2412s traceback: Optional[types.TracebackType], 2412s ) -> NoReturn: 2412s assert self._exc_info is not None 2412s # see #2703 for notes 2412s if type_ is None: 2412s exc_type, exc_value, exc_tb = self._exc_info 2412s assert exc_value is not None 2412s self._exc_info = None # remove potential circular references 2412s > raise exc_value.with_traceback(exc_tb) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s self = 2412s 2412s def __connect(self) -> None: 2412s pool = self.__pool 2412s 2412s # ensure any existing connection is removed, so that if 2412s # creator fails, this attribute stays None 2412s self.dbapi_connection = None 2412s try: 2412s self.starttime = time.time() 2412s > self.dbapi_connection = connection = pool._invoke_creator(self) 2412s 2412s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2412s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2412s 2412s connection_record = 2412s 2412s def connect( 2412s connection_record: Optional[ConnectionPoolEntry] = None, 2412s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_engine_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_sqlalchemy_read_table(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2753: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 
2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'reque...bRequest 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'reque...bRequest 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2413s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s def create_and_load_iris(conn, iris_file: Path): 2413s from sqlalchemy import insert 2413s 2413s iris = iris_table_metadata() 2413s 2413s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2413s reader = csv.reader(csvfile) 2413s header = next(reader) 2413s params = [dict(zip(header, row)) for row in reader] 2413s stmt = insert(iris).values(params) 2413s > with conn.begin() as con: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __enter__(self): 2413s # do 
not keep args and kwds alive unnecessarily 2413s # they are only needed for recreation, which is not possible anymore 2413s del self.args, self.kwds, self.func 2413s try: 2413s > return next(self.gen) 2413s 2413s /usr/lib/python3.13/contextlib.py:141: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @contextlib.contextmanager 2413s def begin(self) -> Iterator[Connection]: 2413s """Return a context manager delivering a :class:`_engine.Connection` 2413s with a :class:`.Transaction` established. 2413s 2413s E.g.:: 2413s 2413s with engine.begin() as conn: 2413s conn.execute( 2413s text("insert into table (x, y, z) values (1, 2, 3)") 2413s ) 2413s conn.execute(text("my_special_procedure(5)")) 2413s 2413s Upon successful operation, the :class:`.Transaction` 2413s is committed. If an error is raised, the :class:`.Transaction` 2413s is rolled back. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.connect` - procure a 2413s :class:`_engine.Connection` from 2413s an :class:`_engine.Engine`. 2413s 2413s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2413s for a particular :class:`_engine.Connection`. 2413s 2413s """ 2413s > with self.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __________ test_sqlalchemy_read_table[postgresql_psycopg2_conn_iris] ___________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_sqlalchemy_read_table(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2753: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'reque...SubRequest 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'reque...SubRequest 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2413s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s def create_and_load_iris(conn, iris_file: Path): 2413s from sqlalchemy import insert 2413s 2413s iris = iris_table_metadata() 2413s 2413s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2413s reader = csv.reader(csvfile) 2413s header = next(reader) 2413s params = [dict(zip(header, row)) for row in reader] 2413s stmt = insert(iris).values(params) 2413s > with conn.begin() as con: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __enter__(self): 2413s # do 
not keep args and kwds alive unnecessarily 2413s # they are only needed for recreation, which is not possible anymore 2413s del self.args, self.kwds, self.func 2413s try: 2413s > return next(self.gen) 2413s 2413s /usr/lib/python3.13/contextlib.py:141: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @contextlib.contextmanager 2413s def begin(self) -> Iterator[Connection]: 2413s """Return a context manager delivering a :class:`_engine.Connection` 2413s with a :class:`.Transaction` established. 2413s 2413s E.g.:: 2413s 2413s with engine.begin() as conn: 2413s conn.execute( 2413s text("insert into table (x, y, z) values (1, 2, 3)") 2413s ) 2413s conn.execute(text("my_special_procedure(5)")) 2413s 2413s Upon successful operation, the :class:`.Transaction` 2413s is committed. If an error is raised, the :class:`.Transaction` 2413s is rolled back. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.connect` - procure a 2413s :class:`_engine.Connection` from 2413s an :class:`_engine.Engine`. 2413s 2413s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2413s for a particular :class:`_engine.Connection`. 2413s 2413s """ 2413s > with self.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________ test_sqlalchemy_read_table_columns[mysql_pymysql_engine_iris] _________ 2413s conn = 'mysql_pymysql_engine_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_sqlalchemy_read_table_columns(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2760: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78b410>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78b410> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _________ test_sqlalchemy_read_table_columns[mysql_pymysql_conn_iris] __________ 2413s conn = 'mysql_pymysql_conn_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_sqlalchemy_read_table_columns(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2760: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
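Note on the mysql_pymysql_* failures above: these are not connection errors. Importing pymysql pulls in pymysql._auth, which imports cryptography.hazmat.primitives.hashes, and that module fails at `Hash = rust_openssl.hashes.Hash` with AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'. The symptom suggests the pure-Python layer of python3-cryptography and its compiled _rust bindings are out of sync on this testbed (an assumption; the log does not show the installed cryptography versions). Because the exception is an AttributeError rather than an ImportError, it propagates out of versioned_importorskip in the traceback instead of being converted into a skip for a missing optional dependency, so the tests error out. A small diagnostic sketch along the lines of the failing import chain (illustrative only, assuming it is run on the same testbed):

    # Diagnostic sketch: reproduce the step that fails in
    # cryptography/hazmat/primitives/hashes.py on this testbed.
    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print("cryptography", cryptography.__version__)
    # hashes.py does `Hash = rust_openssl.hashes.Hash`; if the attribute is
    # missing, this prints False and matches the AttributeError seen above.
    print("rust_openssl.hashes present:", hasattr(rust_openssl, "hashes"))

If the attribute is reported missing, the mismatch between the cryptography Python sources and the compiled bindings is confirmed; which package combination on the testbed causes it cannot be determined from this log alone.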
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 
2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78b4d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78b4d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _____ test_sqlalchemy_read_table_columns[postgresql_psycopg2_engine_iris] ______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
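The failure above bottoms out in AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes', raised while pymysql imports cryptography for its auth helpers: the pure-Python side of cryptography dereferences a symbol that its compiled _rust extension does not expose, which suggests a mismatched or partially installed cryptography build on this i386-on-amd64 testbed rather than a problem in the code under test. A minimal pre-flight probe along the lines below (a hypothetical helper, not part of pandas or the test suite) reproduces the exact check that fails, so the optional dependency could be skipped up front instead of erroring during fixture setup.

    # Hypothetical pre-flight check mirroring the import chain that fails above:
    # pymysql -> pymysql._auth -> cryptography.hazmat.primitives.hashes, which
    # needs the compiled Rust bindings to expose an ``openssl.hashes`` module.
    def cryptography_rust_bindings_ok() -> bool:
        try:
            from cryptography.hazmat.bindings._rust import openssl as rust_openssl
        except ImportError:
            return False
        # ``hashes`` is the attribute dereferenced at hashes.py:87 in the traceback.
        return hasattr(rust_openssl, "hashes")

    if not cryptography_rust_bindings_ok():
        print("cryptography Rust bindings incomplete; importing pymysql will fail")

With a probe like this, the mysql_pymysql_engine fixture could skip cleanly when the bindings are broken, instead of surfacing the AttributeError deep inside the import machinery as it does here.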
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_engine_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_sqlalchemy_read_table_columns(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2760: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 
2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'reque... 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'reque... 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2413s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s def create_and_load_iris(conn, iris_file: Path): 2413s from sqlalchemy import insert 2413s 2413s iris = iris_table_metadata() 2413s 2413s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2413s reader = csv.reader(csvfile) 2413s header = next(reader) 2413s params = [dict(zip(header, row)) for row in reader] 2413s stmt = insert(iris).values(params) 2413s > with conn.begin() as con: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __enter__(self): 2413s # do 
not keep args and kwds alive unnecessarily 2413s # they are only needed for recreation, which is not possible anymore 2413s del self.args, self.kwds, self.func 2413s try: 2413s > return next(self.gen) 2413s 2413s /usr/lib/python3.13/contextlib.py:141: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @contextlib.contextmanager 2413s def begin(self) -> Iterator[Connection]: 2413s """Return a context manager delivering a :class:`_engine.Connection` 2413s with a :class:`.Transaction` established. 2413s 2413s E.g.:: 2413s 2413s with engine.begin() as conn: 2413s conn.execute( 2413s text("insert into table (x, y, z) values (1, 2, 3)") 2413s ) 2413s conn.execute(text("my_special_procedure(5)")) 2413s 2413s Upon successful operation, the :class:`.Transaction` 2413s is committed. If an error is raised, the :class:`.Transaction` 2413s is rolled back. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.connect` - procure a 2413s :class:`_engine.Connection` from 2413s an :class:`_engine.Engine`. 2413s 2413s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2413s for a particular :class:`_engine.Connection`. 2413s 2413s """ 2413s > with self.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______ test_sqlalchemy_read_table_columns[postgresql_psycopg2_conn_iris] _______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_sqlalchemy_read_table_columns(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2760: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'reque...st 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'reque...st 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2413s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s def create_and_load_iris(conn, iris_file: Path): 2413s from sqlalchemy import insert 2413s 2413s iris = iris_table_metadata() 2413s 2413s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2413s reader = csv.reader(csvfile) 2413s header = next(reader) 2413s params = [dict(zip(header, row)) for row in reader] 2413s stmt = insert(iris).values(params) 2413s > with conn.begin() as con: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __enter__(self): 2413s # do 
not keep args and kwds alive unnecessarily 2413s # they are only needed for recreation, which is not possible anymore 2413s del self.args, self.kwds, self.func 2413s try: 2413s > return next(self.gen) 2413s 2413s /usr/lib/python3.13/contextlib.py:141: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @contextlib.contextmanager 2413s def begin(self) -> Iterator[Connection]: 2413s """Return a context manager delivering a :class:`_engine.Connection` 2413s with a :class:`.Transaction` established. 2413s 2413s E.g.:: 2413s 2413s with engine.begin() as conn: 2413s conn.execute( 2413s text("insert into table (x, y, z) values (1, 2, 3)") 2413s ) 2413s conn.execute(text("my_special_procedure(5)")) 2413s 2413s Upon successful operation, the :class:`.Transaction` 2413s is committed. If an error is raised, the :class:`.Transaction` 2413s is rolled back. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.connect` - procure a 2413s :class:`_engine.Connection` from 2413s an :class:`_engine.Engine`. 2413s 2413s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2413s for a particular :class:`_engine.Connection`. 2413s 2413s """ 2413s > with self.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________ test_read_table_absent_raises[mysql_pymysql_engine_iris] ___________ 2413s conn = 'mysql_pymysql_engine_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_read_table_absent_raises(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2769: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78ba70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78ba70> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________ test_read_table_absent_raises[mysql_pymysql_conn_iris] ____________ 2413s conn = 'mysql_pymysql_conn_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_read_table_absent_raises(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2769: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 
2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78bb30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea78bb30> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________ test_read_table_absent_raises[postgresql_psycopg2_engine_iris] ________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_engine_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_read_table_absent_raises(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2769: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 
2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'reque...quest 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'reque...quest 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2413s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s def create_and_load_iris(conn, iris_file: Path): 2413s from sqlalchemy import insert 2413s 2413s iris = iris_table_metadata() 2413s 2413s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2413s reader = csv.reader(csvfile) 2413s header = next(reader) 2413s params = [dict(zip(header, row)) for row in reader] 2413s stmt = insert(iris).values(params) 2413s > with conn.begin() as con: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __enter__(self): 2413s # do 
not keep args and kwds alive unnecessarily 2413s # they are only needed for recreation, which is not possible anymore 2413s del self.args, self.kwds, self.func 2413s try: 2413s > return next(self.gen) 2413s 2413s /usr/lib/python3.13/contextlib.py:141: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @contextlib.contextmanager 2413s def begin(self) -> Iterator[Connection]: 2413s """Return a context manager delivering a :class:`_engine.Connection` 2413s with a :class:`.Transaction` established. 2413s 2413s E.g.:: 2413s 2413s with engine.begin() as conn: 2413s conn.execute( 2413s text("insert into table (x, y, z) values (1, 2, 3)") 2413s ) 2413s conn.execute(text("my_special_procedure(5)")) 2413s 2413s Upon successful operation, the :class:`.Transaction` 2413s is committed. If an error is raised, the :class:`.Transaction` 2413s is rolled back. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.connect` - procure a 2413s :class:`_engine.Connection` from 2413s an :class:`_engine.Engine`. 2413s 2413s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2413s for a particular :class:`_engine.Connection`. 2413s 2413s """ 2413s > with self.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
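# The frames above show SQLAlchemy wrapping the raw psycopg2.OperationalError
# in sqlalchemy.exc.OperationalError (a DBAPIError subclass) and re-raising it
# with the original driver exception attached. A hedged sketch of catching
# that wrapper; it assumes the psycopg2 driver is installed and, as in this
# log, that no server is listening, so the connect attempt fails:
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect():
        pass
except OperationalError as exc:
    print(type(exc.orig))              # the underlying psycopg2.OperationalError
    print(exc.connection_invalidated)  # False here: no connection ever existed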
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
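# The pool frames above show QueuePool creating a fresh _ConnectionRecord
# (and therefore a real DBAPI connection) the moment an empty pool is checked
# out. A minimal sketch of the same pool machinery wrapping sqlite3 directly:
import sqlite3
from sqlalchemy.pool import QueuePool

pool = QueuePool(lambda: sqlite3.connect(":memory:"), pool_size=2, max_overflow=0)
conn = pool.connect()                  # proxied DBAPI connection (_ConnectionFairy)
conn.cursor().execute("select 1")
conn.close()                           # returned to the pool, not really closed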
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
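# The psycopg2.connect docstring above accepts either a DSN string or keyword
# arguments, merged via make_dsn. A hedged sketch using the same parameters as
# the failing tests; without a server on localhost:5432 it raises
# OperationalError exactly as in this log:
import psycopg2
from psycopg2 import OperationalError
from psycopg2.extensions import make_dsn

dsn = make_dsn(host="localhost", port=5432, dbname="pandas",
               user="postgres", password="postgres")
try:
    conn = psycopg2.connect(dsn)       # equivalent: psycopg2.connect(host=..., ...)
    conn.close()
except OperationalError as exc:
    print("server unreachable:", exc)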
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________ test_read_table_absent_raises[postgresql_psycopg2_conn_iris] _________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn_iris' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_iris) 2413s def test_read_table_absent_raises(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2769: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_iris' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
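# test_read_table_absent_raises above receives the fixture *name* as a
# parametrize value and resolves it at runtime with request.getfixturevalue(),
# which is what triggers the failing fixture setup. A minimal sketch of that
# indirection (fixture and test names here are made up):
import sqlite3
import pytest

@pytest.fixture
def sqlite_conn():
    con = sqlite3.connect(":memory:")
    yield con
    con.close()

@pytest.mark.parametrize("conn", ["sqlite_conn"])
def test_roundtrip(conn, request):
    conn = request.getfixturevalue(conn)   # look the fixture up by name
    assert conn.execute("select 1").fetchone() == (1,)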
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_iris' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
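# FixtureDef.execute above returns the cached result when the request's cache
# key matches, and only re-runs the fixture (after tearing it down) when a
# different parametrization is requested. A small sketch of that caching; with
# module scope the fixture body runs once per param value rather than once per
# test (names are illustrative):
import pytest

CALLS = []

@pytest.fixture(scope="module", params=["a", "b"])
def resource(request):
    CALLS.append(request.param)        # records each real setup
    return request.param

def test_uses_resource(resource):
    assert resource in ("a", "b")

def test_reuses_cached_resource(resource):
    # the value seen here came from the cached setup, not a new fixture call
    assert resource in CALLS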
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'reque...Request 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'reque...Request 'postgresql_psycopg2_engine_iris' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'iris_path': PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv'), 'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_path = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_iris(postgresql_psycopg2_engine, iris_path): 2413s > create_and_load_iris(postgresql_psycopg2_engine, iris_path) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:668: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s iris_file = PosixPath('/tmp/autopkgtest.rcV9Ni/build.Jt5/src/pandas/tests/io/data/csv/iris.csv') 2413s 2413s def create_and_load_iris(conn, iris_file: Path): 2413s from sqlalchemy import insert 2413s 2413s iris = iris_table_metadata() 2413s 2413s with iris_file.open(newline=None, encoding="utf-8") as csvfile: 2413s reader = csv.reader(csvfile) 2413s header = next(reader) 2413s params = [dict(zip(header, row)) for row in reader] 2413s stmt = insert(iris).values(params) 2413s > with conn.begin() as con: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:198: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __enter__(self): 2413s # do 
not keep args and kwds alive unnecessarily 2413s # they are only needed for recreation, which is not possible anymore 2413s del self.args, self.kwds, self.func 2413s try: 2413s > return next(self.gen) 2413s 2413s /usr/lib/python3.13/contextlib.py:141: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @contextlib.contextmanager 2413s def begin(self) -> Iterator[Connection]: 2413s """Return a context manager delivering a :class:`_engine.Connection` 2413s with a :class:`.Transaction` established. 2413s 2413s E.g.:: 2413s 2413s with engine.begin() as conn: 2413s conn.execute( 2413s text("insert into table (x, y, z) values (1, 2, 3)") 2413s ) 2413s conn.execute(text("my_special_procedure(5)")) 2413s 2413s Upon successful operation, the :class:`.Transaction` 2413s is committed. If an error is raised, the :class:`.Transaction` 2413s is rolled back. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.connect` - procure a 2413s :class:`_engine.Connection` from 2413s an :class:`_engine.Engine`. 2413s 2413s :meth:`_engine.Connection.begin` - start a :class:`.Transaction` 2413s for a particular :class:`_engine.Connection`. 2413s 2413s """ 2413s > with self.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3242: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __ test_sqlalchemy_default_type_conversion[postgresql_psycopg2_engine_types] ___ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_engine_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_sqlalchemy_default_type_conversion(conn, request): 2413s conn_name = conn 2413s if conn_name == "sqlite_str": 2413s pytest.skip("types tables not created in sqlite_str fixture") 2413s elif "mysql" in conn_name or "sqlite" in conn_name: 2413s request.applymarker( 2413s pytest.mark.xfail(reason="boolean dtype not inferred properly") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2785: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'requ...resql_psycopg2_engine_types' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'requ...resql_psycopg2_engine_types' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2413s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s dialect = 'postgres' 2413s 2413s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2413s from sqlalchemy import insert 2413s from sqlalchemy.engine import Engine 2413s 2413s types = types_table_metadata(dialect) 2413s 2413s stmt = insert(types).values(types_data) 2413s if isinstance(conn, Engine): 2413s 
> with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters 
if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 
2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise 
exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___ test_sqlalchemy_default_type_conversion[postgresql_psycopg2_conn_types] ____ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_sqlalchemy_default_type_conversion(conn, request): 2413s conn_name = conn 2413s if conn_name == "sqlite_str": 2413s pytest.skip("types tables not created in sqlite_str fixture") 2413s elif "mysql" in conn_name or "sqlite" in conn_name: 2413s request.applymarker( 2413s pytest.mark.xfail(reason="boolean dtype not inferred properly") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2785: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'requ...tgresql_psycopg2_engine_types' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'requ...tgresql_psycopg2_engine_types' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2413s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s dialect = 'postgres' 2413s 2413s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2413s from sqlalchemy import insert 2413s from sqlalchemy.engine import Engine 2413s 2413s types = types_table_metadata(dialect) 2413s 2413s stmt = insert(types).values(types_data) 2413s if isinstance(conn, Engine): 2413s 
> with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters 
if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 
2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise 
exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________________ test_bigint[mysql_pymysql_engine] _______________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_bigint(conn, request): 2413s # int64 should be converted to BigInteger, GH7433 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2801: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
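The frames above end with psycopg2 raising OperationalError: connection to server at "localhost", port 5432 failed: Connection refused. In other words, no PostgreSQL server was reachable from the testbed when the postgresql_psycopg2_engine_types fixture tried to open a connection. Below is a minimal sketch of the same connection attempt outside pytest, using only the DSN values visible in the traceback (user/password "postgres", database "pandas" on localhost:5432); it assumes sqlalchemy and psycopg2 are installed and is an illustration, not part of the test suite.

    # Sketch only: reproduce the fixture's connection attempt with the DSN from the log.
    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            # The fixture only gets past this point when something is listening on 5432.
            print(conn.execute(text("SELECT 1")).scalar())
    except OperationalError as err:
        # "Connection refused" here matches the failure recorded above:
        # nothing is listening on localhost:5432 inside the testbed.
        print(f"PostgreSQL not reachable: {err}")

If this prints "PostgreSQL not reachable", every postgresql_psycopg2_* fixture in test_sql.py will fail the same way, independent of pandas itself.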
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1905f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1905f0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________________ test_bigint[mysql_pymysql_conn] ________________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_bigint(conn, request): 2413s # int64 should be converted to BigInteger, GH7433 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2801: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
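The mysql_pymysql_* failures have a different root cause than the PostgreSQL ones: importing pymysql fails inside cryptography with AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes', so versioned_importorskip("pymysql") raises instead of skipping. A short, hedged sketch for probing the same optional imports the SQL-test fixtures rely on (the module names are taken from the tracebacks; the loop itself is illustrative and not part of the test suite):

    # Sketch only: check whether the optional dependencies named in the log import cleanly.
    import importlib

    for name in ("sqlalchemy", "psycopg2", "pymysql", "cryptography"):
        try:
            module = importlib.import_module(name)
            print(name, getattr(module, "__version__", "unknown"))
        except Exception as err:
            # On this testbed pymysql lands here, re-raising the cryptography
            # AttributeError seen above, so every mysql_pymysql_* fixture errors
            # out during setup rather than being skipped.
            print(f"{name}: import failed: {err!r}")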
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
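pytest_fixture_setup and call_fixture_func, shown in the next frames, drive generator fixtures: the code before yield runs as setup, the value at yield is what FixtureDef.execute caches, and the code after yield runs when the finalizer fires. A minimal, self-contained example with hypothetical names:

import pytest

@pytest.fixture
def resource():
    handle = {"open": True}   # setup; this dict is the cached fixture value
    yield handle              # the test body runs while the generator is suspended
    handle["open"] = False    # teardown, executed by the finalizer chain

def test_resource_is_open(resource):
    assert resource["open"]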
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1907d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1907d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________________ test_bigint[postgresql_psycopg2_engine] ____________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
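The first failure above bottoms out in Hash = rust_openssl.hashes.Hash raising AttributeError, i.e. the compiled _rust binding on this testbed does not expose the hashes submodule that the pure-Python half of cryptography expects (commonly a mismatch between the two halves of the package). A quick, hedged check one could run on such a testbed:

import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

# On a consistent install this prints True; in the environment captured by
# this log it would print False, matching the AttributeError above.
print(cryptography.__version__, hasattr(rust_openssl, "hashes"))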
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_bigint(conn, request): 2413s # int64 should be converted to BigInteger, GH7433 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame(data={"i64": [2**62]}) 2413s > assert df.to_sql(name="test_bigint", con=conn, index=False) == 1 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2803: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( i64 2413s 0 4611686018427387904,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_bigint'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = i64 2413s 0 4611686018427387904, name = 'test_bigint' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | 
None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 
2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... 
stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = i64 2413s 0 4611686018427387904, name = 'test_bigint' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 
2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________________ test_bigint[postgresql_psycopg2_conn] _____________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_bigint(conn, request): 2413s # int64 should be converted to BigInteger, GH7433 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2801: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
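The fixture machinery shown above is exercised by one pattern in pandas/tests/io/test_sql.py: tests are parametrized over fixture names and resolve the real connection fixture lazily with request.getfixturevalue, so a database that cannot be reached fails the test at setup time rather than being filtered out at collection. A minimal sketch of that pattern (the fixture name here is hypothetical, standing in for postgresql_psycopg2_conn and friends):

import pytest

@pytest.fixture
def my_conn_fixture():
    # Placeholder for a fixture that would open a real database connection.
    return object()

@pytest.mark.parametrize("conn", ["my_conn_fixture"])
def test_example(conn, request):
    # Any exception raised while setting up the named fixture surfaces here,
    # exactly as in test_bigint above.
    conn = request.getfixturevalue(conn)
    assert conn is not None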
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________ test_default_date_load[mysql_pymysql_engine_types] ______________ 2413s conn = 'mysql_pymysql_engine_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_default_date_load(conn, request): 2413s conn_name = conn 2413s if conn_name == "sqlite_str": 2413s pytest.skip("types tables not created in sqlite_str fixture") 2413s elif "sqlite" in conn_name: 2413s request.applymarker( 2413s pytest.mark.xfail(reason="sqlite does not read date properly") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2819: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb190e30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb190e30> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________ test_default_date_load[mysql_pymysql_conn_types] _______________ 2413s conn = 'mysql_pymysql_conn_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_default_date_load(conn, request): 2413s conn_name = conn 2413s if conn_name == "sqlite_str": 2413s pytest.skip("types tables not created in sqlite_str fixture") 2413s elif "sqlite" in conn_name: 2413s request.applymarker( 2413s pytest.mark.xfail(reason="sqlite does not read date properly") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2819: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 
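This failure, and the identical one reproduced below for test_default_date_load[mysql_pymysql_conn_types], comes down to one root cause: on this testbed the cryptography Rust bindings expose no `hashes` attribute, so importing pymysql raises AttributeError from deep inside `pymysql._auth`, and because pandas' `import_optional_dependency` (and `pytest.importorskip`) by default only turn ImportError into a skip, the AttributeError surfaces as a fixture error rather than a skipped test. A minimal sketch of a broader guard, assuming one wanted to skip on any import-time failure (`optional_import_or_skip` is a hypothetical helper, not pandas or pytest API):

import importlib

import pytest


def optional_import_or_skip(name: str):
    # Import an optional module, skipping the current test on *any*
    # import-time failure, not only ImportError (which is all that
    # pytest.importorskip handles by default).
    try:
        return importlib.import_module(name)
    except Exception as exc:
        pytest.skip(f"optional dependency {name!r} is unusable here: {exc!r}")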
2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 
2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 
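The nested `_get_active_fixturedef` / `execute` frames in this traceback are pytest walking the fixture dependency chain by name: `mysql_pymysql_conn_types` requests `mysql_pymysql_engine_types`, which in turn requests `mysql_pymysql_engine`, and the error occurs while setting up that innermost fixture. A self-contained sketch of the same layering pattern (the fixture names below are illustrative, not the pandas ones):

import pytest


@pytest.fixture
def engine():
    # innermost fixture; stands in for a real SQLAlchemy engine
    return "engine"


@pytest.fixture
def engine_types(engine):
    # builds on `engine`, e.g. by creating a types table
    return f"{engine}+types"


@pytest.fixture
def conn_types(engine_types):
    # outermost fixture handed to the test
    return f"conn({engine_types})"


def test_uses_conn(conn_types):
    assert conn_types == "conn(engine+types)"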
2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb190f50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb190f50> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________ test_default_date_load[postgresql_psycopg2_engine_types] ___________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_engine_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_default_date_load(conn, request): 2413s conn_name = conn 2413s if conn_name == "sqlite_str": 2413s pytest.skip("types tables not created in sqlite_str fixture") 2413s elif "sqlite" in conn_name: 2413s request.applymarker( 2413s pytest.mark.xfail(reason="sqlite does not read date properly") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2819: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
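The psycopg2 OperationalError above is environmental: nothing is listening on localhost:5432 inside the testbed, so the connection is refused before any pandas code runs, and every postgresql-backed fixture fails the same way. A minimal sketch, assuming one wanted to probe for the server and skip early (the helper name is hypothetical, not part of the pandas test suite):

import socket

import pytest


def postgres_reachable(host: str = "localhost", port: int = 5432) -> bool:
    # True if something accepts TCP connections on host:port within 1 second
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        return False


# usage inside a fixture:
#     if not postgres_reachable():
#         pytest.skip("no PostgreSQL server available on this testbed")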
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'requ...SubRequest 'postgresql_psycopg2_engine_types' for >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'requ...SubRequest 'postgresql_psycopg2_engine_types' for >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
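[Editor's note] The FixtureDef.execute() body quoted above compares the current cache key with the stored one using ==, and falls back to an identity check when the comparison itself raises, which is what happens for multi-element NumPy arrays. A small sketch of that comparison pattern, assuming NumPy is installed:

    # Sketch of the "== with identity fallback" comparison used above.
    import numpy as np


    def keys_match(request_key, cached_key) -> bool:
        try:
            return bool(request_key == cached_key)
        except (ValueError, RuntimeError):
            # e.g. NumPy arrays: the truth value of a multi-element array
            # is ambiguous, so fall back to object identity.
            return request_key is cached_key


    a = np.array([1, 2, 3])
    print(keys_match(a, a))          # True: identity fallback on the same object
    print(keys_match(1, 1))          # True: plain equality works
    print(keys_match(a, a.copy()))   # False: == raises, objects differ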
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2413s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s dialect = 'postgres' 2413s 2413s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2413s from sqlalchemy import insert 2413s from sqlalchemy.engine import Engine 2413s 2413s types = types_table_metadata(dialect) 2413s 2413s stmt = insert(types).values(types_data) 2413s if isinstance(conn, Engine): 2413s 
> with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters 
if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 
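[Editor's note] The SQLAlchemy docstrings quoted above describe the Engine.connect() context-manager pattern and how raw_connection() hands out a pooled DBAPI connection whose close() simply returns it to the pool; the failing pandas helper create_and_load_types() builds an insert().values(...) statement and runs it inside such a block. A minimal sketch of the same shape, using an in-memory SQLite engine so it runs without any server (an illustration only, not the pandas helper itself):

    # Sketch: Core insert().values(...) executed via Engine.connect(),
    # mirroring the create_and_load_types() shape seen above.
    from sqlalchemy import (
        Column, Integer, MetaData, Table, create_engine, insert, select,
    )

    engine = create_engine("sqlite://")  # in-memory; no server needed

    metadata = MetaData()
    t = Table("t", metadata, Column("x", Integer))
    metadata.create_all(engine)

    stmt = insert(t).values([{"x": 1}, {"x": 2}])
    with engine.connect() as connection:
        connection.execute(stmt)
        connection.commit()  # 2.0-style: commit the autobegun transaction

    with engine.connect() as connection:
        print(connection.execute(select(t.c.x)).scalars().all())  # [1, 2]

    # raw_connection() hands back the pooled DBAPI connection; close() just
    # returns it to the pool, as the docstring above describes.
    raw = engine.raw_connection()
    raw.close()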
2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise 
exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
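[Editor's note] The psycopg2 docstring above lists the keyword parameters that make up the DSN this run uses (host=localhost dbname=pandas user=postgres password=postgres port=5432). A hedged sketch of opening such a connection and catching the same OperationalError this log reports when nothing is listening on port 5432; the credentials are only repeated from the log for illustration:

    # Sketch: connect with the parameters the failing fixtures use and
    # surface the "Connection refused" case explicitly.
    import psycopg2
    from psycopg2 import OperationalError

    try:
        conn = psycopg2.connect(
            dbname="pandas",
            user="postgres",
            password="postgres",
            host="localhost",
            port=5432,
            connect_timeout=3,
        )
    except OperationalError as exc:
        # This is the error raised throughout this log: no PostgreSQL server
        # is accepting TCP/IP connections on localhost:5432 in the testbed.
        print(f"PostgreSQL unavailable: {exc}")
    else:
        with conn, conn.cursor() as cur:
            cur.execute("SELECT version()")
            print(cur.fetchone()[0])
        conn.close()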
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________ test_default_date_load[postgresql_psycopg2_conn_types] ____________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_default_date_load(conn, request): 2413s conn_name = conn 2413s if conn_name == "sqlite_str": 2413s pytest.skip("types tables not created in sqlite_str fixture") 2413s elif "sqlite" in conn_name: 2413s request.applymarker( 2413s pytest.mark.xfail(reason="sqlite does not read date properly") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2819: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
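[Editor's note] The quoted test_default_date_load is parametrized over fixture names and decides at run time whether to pytest.skip() or to apply an xfail marker with request.applymarker(). A compact sketch of that run-time skip/xfail selection; the backend names and reasons are illustrative:

    # Sketch: per-parametrization skip/xfail decided inside the test body,
    # as in the pandas test quoted above.
    import pytest


    @pytest.mark.parametrize("backend", ["sqlite_str", "sqlite_memory", "postgres"])
    def test_backend_quirks(backend, request):
        if backend == "sqlite_str":
            pytest.skip("illustrative: this variant has no types table")
        elif "sqlite" in backend:
            # Applied at run time, so only this parametrization is xfailed.
            request.applymarker(
                pytest.mark.xfail(reason="illustrative: sqlite date handling")
            )
        assert backend  # placeholder standing in for real checks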
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'requ... >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'requ... >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
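[Editor's note] The hook call above goes through pluggy: pytest_fixture_setup is declared firstresult, so _hookexec() returns the first non-None implementation result instead of a list of results. A minimal pluggy sketch of a firstresult hook; the project and plugin names are illustrative:

    # Sketch: a pluggy "firstresult" hook, the mechanism behind the
    # pytest_fixture_setup call shown above.
    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")


    class Spec:
        @hookspec(firstresult=True)
        def compute(self, x):
            """The first non-None implementation result wins."""


    class Doubler:
        @hookimpl
        def compute(self, x):
            return x * 2


    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Doubler())

    print(pm.hook.compute(x=3))  # 6: a single value because of firstresult=True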
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2413s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s dialect = 'postgres' 2413s 2413s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2413s from sqlalchemy import insert 2413s from sqlalchemy.engine import Engine 2413s 2413s types = types_table_metadata(dialect) 2413s 2413s stmt = insert(types).values(types_data) 2413s if isinstance(conn, Engine): 2413s 
> with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters 
if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 
2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise 
exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______ test_datetime_with_timezone_query[None-postgresql_psycopg2_engine] ______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s parse_dates = None 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s @pytest.mark.parametrize("parse_dates", [None, ["DateColWithTz"]]) 2413s def test_datetime_with_timezone_query(conn, request, parse_dates): 2413s # edge case that converts postgresql datetime with time zone types 2413s # to datetime64[ns,psycopg2.tz.FixedOffsetTimezone..], which is ok 2413s # but should be more natural, so coerce to datetime64[ns] for now 2413s conn = request.getfixturevalue(conn) 2413s > expected = create_and_load_postgres_datetz(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2832: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def create_and_load_postgres_datetz(conn): 2413s from sqlalchemy import ( 2413s Column, 2413s DateTime, 2413s MetaData, 2413s Table, 2413s insert, 2413s ) 2413s from sqlalchemy.engine import Engine 2413s 2413s metadata = MetaData() 2413s datetz = Table("datetz", metadata, Column("DateColWithTz", DateTime(timezone=True))) 2413s datetz_data = [ 2413s { 2413s "DateColWithTz": "2000-01-01 00:00:00-08:00", 2413s }, 2413s { 2413s "DateColWithTz": "2000-06-01 00:00:00-07:00", 2413s }, 2413s ] 2413s stmt = insert(datetz).values(datetz_data) 2413s if isinstance(conn, Engine): 2413s > with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:351: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______ test_datetime_with_timezone_query[None-postgresql_psycopg2_conn] _______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s parse_dates = None 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s @pytest.mark.parametrize("parse_dates", [None, ["DateColWithTz"]]) 2413s def test_datetime_with_timezone_query(conn, request, parse_dates): 2413s # edge case that converts postgresql datetime with time zone types 2413s # to datetime64[ns,psycopg2.tz.FixedOffsetTimezone..], which is ok 2413s # but should be more natural, so coerce to datetime64[ns] for now 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2831: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __ test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_engine] __ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s parse_dates = ['DateColWithTz'] 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s @pytest.mark.parametrize("parse_dates", [None, ["DateColWithTz"]]) 2413s def test_datetime_with_timezone_query(conn, request, parse_dates): 2413s # edge case that converts postgresql datetime with time zone types 2413s # to datetime64[ns,psycopg2.tz.FixedOffsetTimezone..], which is ok 2413s # but should be more natural, so coerce to datetime64[ns] for now 2413s conn = request.getfixturevalue(conn) 2413s > expected = create_and_load_postgres_datetz(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2832: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def create_and_load_postgres_datetz(conn): 2413s from sqlalchemy import ( 2413s Column, 2413s DateTime, 2413s MetaData, 2413s Table, 2413s insert, 2413s ) 2413s from sqlalchemy.engine import Engine 2413s 2413s metadata = MetaData() 2413s datetz = Table("datetz", metadata, Column("DateColWithTz", DateTime(timezone=True))) 2413s datetz_data = [ 2413s { 2413s "DateColWithTz": "2000-01-01 00:00:00-08:00", 2413s }, 2413s { 2413s "DateColWithTz": "2000-06-01 00:00:00-07:00", 2413s }, 2413s ] 2413s stmt = insert(datetz).values(datetz_data) 2413s if isinstance(conn, Engine): 2413s > with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:351: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___ test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_conn] ___ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s parse_dates = ['DateColWithTz'] 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s @pytest.mark.parametrize("parse_dates", [None, ["DateColWithTz"]]) 2413s def test_datetime_with_timezone_query(conn, request, parse_dates): 2413s # edge case that converts postgresql datetime with time zone types 2413s # to datetime64[ns,psycopg2.tz.FixedOffsetTimezone..], which is ok 2413s # but should be more natural, so coerce to datetime64[ns] for now 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2831: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
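The _handle_dbapi_exception_noconnection frame above shows SQLAlchemy wrapping the driver-level psycopg2 error in sqlalchemy.exc.OperationalError before re-raising it. A minimal sketch of catching that wrapped error from calling code, reusing the URL form of the engine in this traceback (whether the server is reachable is an assumption, not something this log confirms):

```python
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError  # SQLAlchemy's wrapper around the DBAPI error

# Same URL form as the engine in the traceback; the values are assumptions for illustration.
engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

try:
    with engine.connect() as conn:
        print("connected:", conn)
except OperationalError as exc:
    # exc.orig is the original psycopg2.OperationalError wrapped further up the stack.
    print("database unavailable:", exc.orig)
```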
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___ test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_engine] ____ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
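Every frame in this chain ultimately fails with "Connection refused" on localhost:5432, i.e. nothing is listening on the PostgreSQL port. A standard-library-only sketch of the kind of reachability probe one might run on the testbed before the SQL tests; the helper name is hypothetical and not part of the pandas suite:

```python
import socket

def postgres_reachable(host: str = "localhost", port: int = 5432, timeout: float = 1.0) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # ConnectionRefusedError, as seen in the traceback, is a subclass of OSError.
        return False

if __name__ == "__main__":
    print("postgres reachable on localhost:5432 ->", postgres_reachable())
```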
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s def test_datetime_with_timezone_query_chunksize(conn, request): 2413s conn = request.getfixturevalue(conn) 2413s > expected = create_and_load_postgres_datetz(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2843: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def create_and_load_postgres_datetz(conn): 2413s from sqlalchemy import ( 2413s Column, 2413s DateTime, 2413s MetaData, 2413s Table, 2413s insert, 2413s ) 2413s from sqlalchemy.engine import Engine 2413s 2413s metadata = MetaData() 2413s datetz = Table("datetz", metadata, Column("DateColWithTz", DateTime(timezone=True))) 2413s datetz_data = [ 2413s { 2413s "DateColWithTz": "2000-01-01 00:00:00-08:00", 2413s }, 2413s { 2413s "DateColWithTz": "2000-06-01 00:00:00-07:00", 2413s }, 2413s ] 2413s stmt = insert(datetz).values(datetz_data) 2413s if isinstance(conn, Engine): 2413s > with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:351: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
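The psycopg2.connect() docstring quoted in this traceback lists the keyword connection parameters, and the cparams dict shows the values the fixture uses (dbname=pandas, host=localhost, user=postgres, port=5432). A minimal standalone sketch of the same call with the error handled at the driver level; the values are taken from the log but are otherwise assumptions:

```python
import psycopg2

# Same keyword parameters as the cparams dict in the traceback; assumed values, for illustration.
params = {"dbname": "pandas", "host": "localhost", "user": "postgres",
          "password": "postgres", "port": 5432}

try:
    conn = psycopg2.connect(**params)
except psycopg2.OperationalError as exc:
    # The driver-level error that SQLAlchemy wraps further up the stack.
    print("connect failed:", exc)
else:
    with conn, conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())
    conn.close()
```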
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____ test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_conn] _____ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
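The Engine.connect() docstring quoted repeatedly above illustrates the context-manager pattern: connect, execute, commit, and return the DBAPI connection to the pool on exit. A self-contained sketch of that pattern using an in-memory SQLite URL so it runs without a server; the table is made up for illustration:

```python
from sqlalchemy import create_engine, text

# In-memory SQLite so the sketch runs without any server; table name is made up.
engine = create_engine("sqlite+pysqlite:///:memory:")

with engine.connect() as connection:
    connection.execute(text("CREATE TABLE demo (value TEXT)"))
    connection.execute(text("INSERT INTO demo VALUES ('foo')"))
    connection.commit()  # explicit commit, as in the docstring example
    print(connection.execute(text("SELECT value FROM demo")).all())
# Leaving the block returns the DBAPI connection to the pool and rolls back
# any transaction still in progress, as the docstring describes.
```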
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s def test_datetime_with_timezone_query_chunksize(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2842: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
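request.getfixturevalue(), whose docstring and lookup logic appear above, is what lets the pandas test take a fixture name as a parametrize value and resolve it at runtime. A minimal sketch of that pattern with made-up fixture names:

```python
import pytest

@pytest.fixture
def fast_backend():
    return "fast"

@pytest.fixture
def slow_backend():
    return "slow"

# Parametrize over fixture *names*, then resolve them at runtime,
# mirroring the `conn = request.getfixturevalue(conn)` pattern in test_sql.py.
@pytest.mark.parametrize("backend_name", ["fast_backend", "slow_backend"])
def test_backend(backend_name, request):
    backend = request.getfixturevalue(backend_name)
    assert backend in ("fast", "slow")
```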
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________ test_datetime_with_timezone_table[postgresql_psycopg2_engine] _________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s def test_datetime_with_timezone_table(conn, request): 2413s conn = request.getfixturevalue(conn) 2413s > expected = create_and_load_postgres_datetz(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2856: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def create_and_load_postgres_datetz(conn): 2413s from sqlalchemy import ( 2413s Column, 2413s DateTime, 2413s MetaData, 2413s Table, 2413s insert, 2413s ) 2413s from sqlalchemy.engine import Engine 2413s 2413s metadata = MetaData() 2413s datetz = Table("datetz", metadata, Column("DateColWithTz", DateTime(timezone=True))) 2413s datetz_data = [ 2413s { 2413s "DateColWithTz": "2000-01-01 00:00:00-08:00", 2413s }, 2413s { 2413s "DateColWithTz": "2000-06-01 00:00:00-07:00", 2413s }, 2413s ] 2413s stmt = insert(datetz).values(datetz_data) 2413s if isinstance(conn, Engine): 2413s > with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:351: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________ test_datetime_with_timezone_table[postgresql_psycopg2_conn] __________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", postgresql_connectable) 2413s def test_datetime_with_timezone_table(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2855: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
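The cached-result check above deliberately tolerates comparisons that raise: for cache keys such as numpy arrays, == is elementwise and bool() of the result fails, which is why the code falls back to an identity check (see the "#6497" comment). A standalone illustration, assuming numpy is installed:

import numpy as np

request_cache_key = np.array([1, 2, 3])
cache_key = np.array([1, 2, 3])
try:
    # bool() of an elementwise comparison raises "truth value ... is ambiguous"
    cache_hit = bool(request_cache_key == cache_key)
except (ValueError, RuntimeError):
    # fall back to identity, as FixtureDef.execute does above
    cache_hit = request_cache_key is cache_key
print(cache_hit)  # False: distinct array objects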
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
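Engine.raw_connection, quoted just above, hands back a proxied DBAPI connection checked out from the pool; its close() returns the connection to the pool rather than closing it. A minimal sketch of that behaviour, using an in-memory SQLite engine only so the example does not need the PostgreSQL server this testbed is missing:

from sqlalchemy import create_engine

engine = create_engine("sqlite://")    # in-memory database, no server required
raw = engine.raw_connection()          # proxied DBAPI connection from the pool
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())              # (1,)
finally:
    raw.close()                        # returns the connection to the pool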
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
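psycopg2.connect, quoted above, merges its keyword arguments into a libpq DSN via make_dsn before opening the socket; the resulting string is exactly the 'host=localhost dbname=pandas ... port=5432' value shown in this frame, and the OperationalError follows because nothing is listening on localhost:5432 in this testbed. A small reproduction sketch (key order in the generated DSN may differ):

import psycopg2
from psycopg2 import extensions as _ext

dsn = _ext.make_dsn(dbname="pandas", host="localhost",
                    user="postgres", password="postgres", port=5432)
print(dsn)

try:
    conn = psycopg2.connect(dsn)
except psycopg2.OperationalError as exc:
    # with no PostgreSQL server on localhost:5432 this mirrors the failure above
    print(exc)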
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________ test_datetime_with_timezone_roundtrip[mysql_pymysql_engine] __________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_with_timezone_roundtrip(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2864: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
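versioned_importorskip and import_optional_dependency, quoted above, are the layer that turns an optional-dependency problem into a test skip or an ImportError with an install hint. A small usage sketch of the pandas helper shown in this traceback; the errors="ignore" behaviour is taken from its docstring, which only promises None for a module that is not installed, so a different import-time failure (such as the AttributeError later in this log) would still propagate:

from pandas.compat._optional import import_optional_dependency

# per the docstring above: missing module -> None; present module -> the module,
# even if older than pandas' minimum, when errors="ignore"
pymysql = import_optional_dependency("pymysql", errors="ignore")
if pymysql is None:
    print("pymysql not installed; MySQL-backed tests would be skipped")
else:
    print("pymysql", pymysql.__version__)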
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb191bb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb191bb0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________ test_datetime_with_timezone_roundtrip[mysql_pymysql_conn] ___________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_with_timezone_roundtrip(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2864: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
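The AttributeError above is the actual reason the pymysql-backed fixtures fail in this run: importing pymysql pulls in pymysql._auth, which imports cryptography.hazmat.primitives.hashes, and that module cannot bind Hash from the compiled _rust bindings in this environment (a mismatch between the installed cryptography Python code and its Rust extension is one plausible explanation; the log does not say). On a healthy installation the same import provides the usual digest API, e.g.:

from cryptography.hazmat.primitives import hashes  # fails here with the AttributeError above

digest = hashes.Hash(hashes.SHA256())
digest.update(b"pandas autopkgtest")
print(digest.finalize().hex())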
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
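The Debian-specific versioned_importorskip wrapper shown above is essentially pytest.importorskip with pandas' minimum-version policy: it delegates to import_optional_dependency and turns a failed import into a skip. A simplified, hedged sketch of the idea (the helper name and version handling are illustrative, not the exact pandas code):

    import importlib
    import pytest
    from packaging.version import Version

    def importorskip_min(name, min_version=None):
        # Simplified stand-in for versioned_importorskip: import the
        # optional dependency or skip the calling test.
        try:
            module = importlib.import_module(name)
        except ImportError as err:
            pytest.skip(f"could not import {name!r}: {err}")
        installed = getattr(module, "__version__", None)
        if min_version and installed and Version(installed) < Version(min_version):
            pytest.skip(f"{name} {installed} is older than required {min_version}")
        return module

Note that only ImportError is converted into a skip; the AttributeError raised deep inside the pymysql import chain below is not caught, which is why these mysql fixtures error out instead of being skipped.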
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb191c70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb191c70> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______ test_datetime_with_timezone_roundtrip[postgresql_psycopg2_engine] _______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
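The pymysql fixture error above bottoms out in the cryptography package: rust_openssl.hashes is missing at import time, which typically points at the pure-Python cryptography sources and the compiled _rust extension on the testbed coming from mismatched builds, rather than at anything in pandas or pymysql. A quick interactive probe (not part of the test suite) that reproduces the same symptom:

    # If the installed python3-cryptography files and its compiled _rust
    # extension are out of sync, this attribute lookup raises the same
    # AttributeError seen in the log above.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(hasattr(rust_openssl, "hashes"))  # True on a healthy install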
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_with_timezone_roundtrip(conn, request): 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s # GH 9086 2413s # Write datetimetz data to a db and read it back 2413s # For dbs that support timestamps with timezones, should get back UTC 2413s # otherwise naive data should be returned 2413s expected = DataFrame( 2413s {"A": date_range("2013-01-01 09:00:00", periods=3, tz="US/Pacific")} 2413s ) 2413s > assert expected.to_sql(name="test_datetime_tz", con=conn, index=False) == 3 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2872: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( A 2413s 0 2013-01-01 09:00:00-08:00 2413s 1 2013-01-02 09:00:00-08:00 2413s 2 2013-01-03 09:00:00-08:00,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_datetime_tz'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = A 2413s 0 2013-01-01 09:00:00-08:00 2413s 1 2013-01-02 09:00:00-08:00 2413s 2 2013-01-03 09:00:00-08:00 2413s name = 
'test_datetime_tz' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 
2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... 
return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = A 2413s 0 2013-01-01 09:00:00-08:00 2413s 1 2013-01-02 09:00:00-08:00 2413s 2 2013-01-03 09:00:00-08:00 2413s name = 'test_datetime_tz' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). 
If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
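pandasSQL_builder, quoted above, dispatches on the connection type: a plain sqlite3.Connection (or None) takes the built-in SQLite fallback, while strings and SQLAlchemy connectables go through SQLDatabase, which is where the failing engine.connect() call happens. A small self-contained example of the fallback path, which needs no database server:

    import sqlite3
    import pandas as pd

    df = pd.DataFrame({"a": [1, 2, 3]})

    # A raw sqlite3 connection uses pandas' SQLite fallback, so no
    # SQLAlchemy engine (and no running server) is involved.
    with sqlite3.connect(":memory:") as conn:
        rows = df.to_sql(name="t", con=conn, index=False)

    print(rows)  # 3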
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______ test_datetime_with_timezone_roundtrip[postgresql_psycopg2_conn] ________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_with_timezone_roundtrip(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2864: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________ test_out_of_bounds_datetime[mysql_pymysql_engine] _______________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_out_of_bounds_datetime(conn, request): 2413s # GH 26761 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2895: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb192210>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb192210> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________ test_out_of_bounds_datetime[mysql_pymysql_conn] ________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_out_of_bounds_datetime(conn, request): 2413s # GH 26761 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2895: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
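Editorial note: the _get_active_fixturedef and FixtureDef.execute frames above are pytest resolving a fixture requested by name at run time, which is how test_out_of_bounds_datetime turns the string 'mysql_pymysql_conn' into a connection. A self-contained illustration of that pattern with stand-in names (not the pandas fixtures):

    import pytest

    @pytest.fixture
    def fake_engine():
        # Stand-in for mysql_pymysql_engine; the real fixture imports pymysql
        # and builds a SQLAlchemy engine, the step that fails in this log.
        yield "engine"

    @pytest.mark.parametrize("conn", ["fake_engine"])
    def test_example(conn, request):
        # getfixturevalue() resolves the fixture name dynamically; any error
        # during fixture setup surfaces here, before the test body runs.
        conn = request.getfixturevalue(conn)
        assert conn == "engine"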
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
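Editorial note: the _HookCaller.__call__ and _hookexec frames above are pluggy dispatching pytest's pytest_fixture_setup hook. A minimal pluggy round-trip showing that machinery in isolation; the project and hook names here are illustrative, not pytest's:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_item(self, item):
            """Specification: the first non-None implementation result wins."""

    class Plugin:
        @hookimpl
        def setup_item(self, item):
            return f"set up {item}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    # Hook calls are keyword-only; with firstresult=True a single value comes
    # back instead of a list, as with pytest_fixture_setup.
    print(pm.hook.setup_item(item="mysql_pymysql_engine"))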
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb192330>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb192330> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
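Editorial note: for reference, what the mysqlclient compatibility shim quoted above does on an installation where pymysql imports cleanly (it cannot run on this testbed because of the cryptography failure); a usage sketch only:

    import pymysql

    pymysql.install_as_MySQLdb()

    # Anything importing MySQLdb from now on receives the pymysql module.
    import MySQLdb
    assert MySQLdb is pymysql
    print(MySQLdb.get_client_info())  # the mysqlclient-compatible version string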
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________ test_out_of_bounds_datetime[postgresql_psycopg2_engine] ____________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_out_of_bounds_datetime(conn, request): 2413s # GH 26761 2413s conn = request.getfixturevalue(conn) 2413s data = DataFrame({"date": datetime(9999, 1, 1)}, index=[0]) 2413s > assert data.to_sql(name="test_datetime_obb", con=conn, index=False) == 1 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2897: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( date 2413s 0 9999-01-01,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_datetime_obb'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = date 2413s 0 9999-01-01, name = 'test_datetime_obb' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str 
| None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 
2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... 
stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = date 2413s 0 9999-01-01, name = 'test_datetime_obb' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 
2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
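The frames above walk from pandas' SQLDatabase down through Engine.connect(), the connection pool, and finally psycopg2. A minimal sketch of that same path outside the test suite might look like the following; the URI mirrors the one in the log, and the point is only to show where sqlalchemy.exc.OperationalError surfaces when nothing is listening on localhost:5432:

    # Sketch only (not part of the test suite): the layering the traceback walks.
    # Assumes sqlalchemy and psycopg2 are installed, as on this testbed.
    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        # Engine.connect() checks a connection out of the pool, which in turn
        # calls psycopg2.connect(); with no server on port 5432 this raises.
        with engine.connect() as conn:
            pass
    except OperationalError as exc:
        # exc wraps the underlying psycopg2.OperationalError ("Connection refused").
        print(type(exc.orig), exc.orig)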
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________ test_out_of_bounds_datetime[postgresql_psycopg2_conn] _____________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
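The psycopg2.connect docstring quoted above accepts either a DSN string or keyword arguments; a small, purely illustrative rendering of the two equivalent forms the failing fixture ends up using (values copied from the cparams shown in the log) is:

    # Sketch only: the two equivalent ways psycopg2.connect() is invoked above.
    import psycopg2

    dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    try:
        conn = psycopg2.connect(dsn)  # DSN string form
        # Keyword form, equivalent to the cparams dict in the traceback:
        # conn = psycopg2.connect(dbname="pandas", host="localhost",
        #                         user="postgres", password="postgres", port=5432)
        conn.close()
    except psycopg2.OperationalError as exc:
        # With no PostgreSQL server running, this is the "Connection refused" above.
        print(exc)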
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_out_of_bounds_datetime(conn, request): 2413s # GH 26761 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2895: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
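test_out_of_bounds_datetime above is parametrized over fixture *names* and resolves them at run time with request.getfixturevalue, exactly as the docstring quoted above describes. Stripped of the pandas specifics, the pattern is roughly this (sqlite_conn is a made-up fixture name for illustration):

    # Hedged sketch of the parametrize-over-fixture-names pattern used above.
    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn", ["sqlite_conn"])
    def test_roundtrip(conn, request):
        # The string parameter is turned into the live fixture value here,
        # just as pandas does with 'postgresql_psycopg2_conn' above.
        conn = request.getfixturevalue(conn)
        assert conn.execute("SELECT 1").fetchone() == (1,)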
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
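_handle_dbapi_exception_noconnection above re-raises the wrapped error with "raise ... from e", which is why the log shows the psycopg2.OperationalError first and then "The above exception was the direct cause of the following exception" before the sqlalchemy.exc.OperationalError. The chaining mechanism in isolation, as a sketch:

    # Sketch of the exception chaining seen in the log output above.
    class WrappedError(Exception):
        pass

    def connect_raw():
        raise ConnectionRefusedError("connection to server at 'localhost' failed")

    try:
        try:
            connect_raw()
        except ConnectionRefusedError as e:
            # Same shape as: raise sqlalchemy_exception.with_traceback(...) from e
            raise WrappedError("(ConnectionRefusedError) wrapped") from e
    except WrappedError as exc:
        print(exc.__cause__)  # the original low-level error is preserved here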
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________ test_naive_datetimeindex_roundtrip[mysql_pymysql_engine] ___________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_naive_datetimeindex_roundtrip(conn, request): 2413s # GH 23510 2413s # Ensure that a naive DatetimeIndex isn't converted to UTC 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2907: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
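Both PostgreSQL-backed failures above come down to nothing listening on localhost:5432 in this testbed. Purely as an illustration (this is not what the pandas suite does here), a reachability probe is one common way to gate such tests instead of letting them error:

    # Illustrative only: skip database-backed tests when the server is unreachable.
    import socket
    import pytest

    def _can_connect(host="localhost", port=5432, timeout=1.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    requires_postgres = pytest.mark.skipif(
        not _can_connect(), reason="no PostgreSQL server on localhost:5432"
    )

    @requires_postgres
    def test_needs_postgres():
        ...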
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
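The import_optional_dependency helper reproduced above is what versioned_importorskip calls under the hood. A short sketch of its errors modes, assuming the same pandas installation as this testbed (pandas.compat._optional is a private pandas module, so this is illustrative only; "not_a_real_module" is a deliberately nonexistent name):

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore": a missing dependency comes back as None instead of raising.
    print(import_optional_dependency("not_a_real_module", errors="ignore"))  # None

    # errors="raise" (the default used by versioned_importorskip) turns the same
    # miss into an ImportError carrying the "Missing optional dependency ..." message.
    try:
        import_optional_dependency("not_a_real_module")
    except ImportError as exc:
        print(exc)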
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1929f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1929f0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________ test_naive_datetimeindex_roundtrip[mysql_pymysql_conn] ____________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_naive_datetimeindex_roundtrip(conn, request): 2413s # GH 23510 2413s # Ensure that a naive DatetimeIndex isn't converted to UTC 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2907: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
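The AttributeError above (rust_openssl has no attribute 'hashes') is the root cause shared by the mysql_pymysql_* fixture failures in this run: pymysql imports cryptography, and cryptography's pure-Python hashes.py dereferences a hashes submodule on its compiled Rust bindings that this build does not expose. A minimal check, assuming the same interpreter and packages as the testbed, that confirms the mismatch without going through pandas or pytest:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(cryptography.__version__)
    # False here reproduces the AttributeError raised from hashes.py:87 above;
    # it usually indicates the Python package and its compiled _rust extension
    # come from mismatched builds.
    print(hasattr(rust_openssl, "hashes"))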
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
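getfixturevalue, whose docstring and lookup machinery appear above, is how the parametrized SQL tests turn a fixture name string ('mysql_pymysql_conn', 'postgresql_psycopg2_engine', ...) into the actual connection object at test time. A minimal sketch of the same pattern, with an illustrative fixture name rather than the ones from test_sql.py:

    import pytest

    @pytest.fixture
    def sqlite_url():
        return "sqlite://"

    @pytest.mark.parametrize("conn", ["sqlite_url"])
    def test_dynamic_fixture(conn, request):
        # the parametrized value is a fixture *name*; resolve it at runtime,
        # just as test_naive_datetimeindex_roundtrip does above
        value = request.getfixturevalue(conn)
        assert value == "sqlite://"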
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
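The comment block above describes fixture overriding: a closer-scoped fixture may reuse the name of the one it overrides and, by requesting its own name, receive the value from one level up. A small illustrative sketch of that behaviour (class-level override of a module-level fixture; the names are made up):

    import pytest

    @pytest.fixture
    def connect_args():
        return {"timeout": 5}

    class TestOverride:
        @pytest.fixture
        def connect_args(self, connect_args):
            # requesting our own name resolves to the module-level fixture,
            # one level up, as described in _get_active_fixturedef above
            return {**connect_args, "autocommit": True}

        def test_merged(self, connect_args):
            assert connect_args == {"timeout": 5, "autocommit": True}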
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb192ab0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb192ab0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________ test_naive_datetimeindex_roundtrip[postgresql_psycopg2_engine] ________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
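The postgresql_psycopg2_engine failure above walks the normal SQLAlchemy checkout path: Engine.raw_connection() asks the pool for a connection, the pool creates a new _ConnectionRecord, and that record finally calls the DBAPI connect(). A tiny sketch of the same path using an in-memory sqlite engine, which needs no server and therefore cannot fail at the connect step:

    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")

    raw = engine.raw_connection()   # proxied DBAPI connection checked out of the pool
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchone())       # (1,)
    finally:
        raw.close()                 # returns the connection to the pool rather than closing it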
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_naive_datetimeindex_roundtrip(conn, request): 2413s # GH 23510 2413s # Ensure that a naive DatetimeIndex isn't converted to UTC 2413s conn = request.getfixturevalue(conn) 2413s dates = date_range("2018-01-01", periods=5, freq="6h")._with_freq(None) 2413s expected = DataFrame({"nums": range(5)}, index=dates) 2413s > assert expected.to_sql(name="foo_table", con=conn, index_label="info_date") == 5 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2910: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( nums 2413s 2018-01-01 00:00:00 0 2413s 2018-01-01 06:00:00 1 2413s 2018-01-01 12:00:00 2 2413s 2018-01-01 18:00:00 3 2413s 2018-01-02 00:00:00 4,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index_label': 'info_date', 'name': 'foo_table'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = nums 2413s 2018-01-01 00:00:00 0 2413s 2018-01-01 06:00:00 1 2413s 2018-01-01 12:00:00 2 2413s 2018-01-01 18:00:00 3 2413s 2018-01-02 00:00:00 4 2413s name = 'foo_table' 2413s con = 
Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = 'info_date' 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 
2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... 
return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = nums 2413s 2018-01-01 00:00:00 0 2413s 2018-01-01 06:00:00 1 2413s 2018-01-01 12:00:00 2 2413s 2018-01-01 18:00:00 3 2413s 2018-01-02 00:00:00 4 2413s name = 'foo_table' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = 'info_date' 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 
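The docstring frames quoted above describe the connectables that to_sql accepts (SQLAlchemy Engine/Connection, ADBC, or a sqlite3 DBAPI connection). As a minimal sketch outside the captured run, assuming only the sqlite3 fallback described above (table and variable names are illustrative, not from the test suite), the same API can be exercised without any database server:

    import sqlite3
    import pandas as pd

    df = pd.DataFrame({"a": [1, 2, 3]})
    conn = sqlite3.connect(":memory:")                     # DBAPI2 connection, no server needed
    rows = df.to_sql(name="demo", con=conn, index=False)   # sum of cursor rowcounts
    out = pd.read_sql_query("SELECT * FROM demo", conn)
    conn.close()
    print(rows, len(out))                                  # expected: 3 3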
2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 
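The pandasSQL_builder frame above dispatches on the type of `con` before any connection is attempted. A minimal sketch of that branching, assuming a simplified standalone restatement rather than the pandas implementation itself:

    import sqlite3

    def classify_connectable(con):
        # Mirrors, in simplified form, the branching shown in the quoted builder.
        if isinstance(con, sqlite3.Connection) or con is None:
            return "sqlite3 fallback (SQLiteDatabase)"
        try:
            import sqlalchemy
        except ImportError:
            sqlalchemy = None
        if isinstance(con, str) and sqlalchemy is None:
            raise ImportError("Using URI string without sqlalchemy installed.")
        if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)):
            return "SQLAlchemy path (SQLDatabase)"
        return "unrecognized connectable"

    print(classify_connectable(sqlite3.connect(":memory:")))  # sqlite3 fallback (SQLiteDatabase)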
2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = 
_raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________ test_naive_datetimeindex_roundtrip[postgresql_psycopg2_conn] _________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_naive_datetimeindex_roundtrip(conn, request): 2413s # GH 23510 2413s # Ensure that a naive DatetimeIndex isn't converted to UTC 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2907: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________________ test_date_parsing[mysql_pymysql_engine_types] _________________ 2413s conn = 'mysql_pymysql_engine_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_date_parsing(conn, request): 2413s # No Parsing 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2920: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1930b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1930b0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _________________ test_date_parsing[mysql_pymysql_conn_types] __________________ 2413s conn = 'mysql_pymysql_conn_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_date_parsing(conn, request): 2413s # No Parsing 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2920: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 
2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 
2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1931d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1931d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _____________ test_date_parsing[postgresql_psycopg2_engine_types] ______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
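The AttributeError just above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes', raised from cryptography/hazmat/primitives/hashes.py while pymysql imports its optional cryptography support) usually points at a mismatch between the pure-Python python3-cryptography package and its compiled Rust extension, rather than at a pandas or pymysql bug. A minimal stand-alone check of the installed binding might look like the sketch below; only the import paths are taken from the traceback, the rest is illustrative:

# Sketch: verify the cryptography Rust binding outside the test suite.
# On a healthy install this prints the version and a SHA-256 digest; on a
# mismatched install the 'hashes' import re-raises the AttributeError seen above.
import cryptography
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print("cryptography version:", cryptography.__version__)
print("rust binding exposes 'hashes':", hasattr(rust_openssl, "hashes"))

from cryptography.hazmat.primitives import hashes  # fails here if the binding is stale

digest = hashes.Hash(hashes.SHA256())
digest.update(b"autopkgtest")
print("SHA-256:", digest.finalize().hex())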
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_engine_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_date_parsing(conn, request): 2413s # No Parsing 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2920: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 
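The psycopg2.OperationalError above ("connection to server at \"localhost\" ... port 5432 failed: Connection refused") is an environment problem: the postgresql_psycopg2_* fixtures in pandas/tests/io/test_sql.py expect a PostgreSQL server on localhost:5432 with database pandas and user/password postgres, as the DSN in the traceback shows. A hedged pre-flight check when reproducing this locally, using only the DSN copied from the log, could be:

# Sketch: confirm the DSN the pandas SQL-test fixtures use is reachable.
# The DSN values are copied from the traceback; adjust them for your setup.
import psycopg2

DSN = "host=localhost dbname=pandas user=postgres password=postgres port=5432"

try:
    conn = psycopg2.connect(DSN)
except psycopg2.OperationalError as exc:
    print("PostgreSQL not reachable; the postgresql_psycopg2_* fixtures will fail:", exc)
else:
    print("PostgreSQL reachable, server_version =", conn.server_version)
    conn.close()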
2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2413s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s dialect = 'postgres' 2413s 2413s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2413s from sqlalchemy import insert 2413s from sqlalchemy.engine import Engine 2413s 2413s types = types_table_metadata(dialect) 2413s 2413s stmt = insert(types).values(types_data) 2413s if isinstance(conn, Engine): 2413s 
> with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters 
if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 
2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise 
exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________ test_date_parsing[postgresql_psycopg2_conn_types] _______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn_types' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable_types) 2413s def test_date_parsing(conn, request): 2413s # No Parsing 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2920: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_types' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_engine_types' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. 
(#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'types_data': [{'Bool...ol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}]} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_engine_types(postgresql_psycopg2_engine, types_data): 2413s > create_and_load_types(postgresql_psycopg2_engine, types_data, "postgres") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:675: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s types_data = [{'BoolCol': False, 'BoolColWithNull': False, 'DateCol': '2000-01-03 00:00:00', 'FloatCol': 10.1, ...}, {'BoolCol': False, 'BoolColWithNull': None, 'DateCol': '2000-01-04 00:00:00', 'FloatCol': 10.1, ...}] 2413s dialect = 'postgres' 2413s 2413s def create_and_load_types(conn, types_data: list[dict], dialect: str): 2413s from sqlalchemy import insert 2413s from sqlalchemy.engine import Engine 2413s 2413s types = types_table_metadata(dialect) 2413s 2413s stmt = insert(types).values(types_data) 2413s if isinstance(conn, Engine): 2413s 
> with conn.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters 
if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 
2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise 
exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _____________________ test_datetime[mysql_pymysql_engine] ______________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2951: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1937d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1937d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______________________ test_datetime[mysql_pymysql_conn] _______________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2951: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1938f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1938f0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________________ test_datetime[postgresql_psycopg2_engine] ___________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
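Editor's note (not part of the captured log): the fixture failure above bottoms out in cryptography/hazmat/primitives/hashes.py, where the pure-Python layer expects the compiled Rust binding to expose a hashes submodule; when the installed cryptography package and its _rust extension are out of sync, the module-level lookup rust_openssl.hashes raises AttributeError at import time. pymysql pulls cryptography in from its _auth module, so import pymysql fails, and because the exception is an AttributeError rather than an ImportError the fixture setup errors out here instead of producing a clean skip. The following is a minimal diagnostic sketch, editor-supplied (the helper name probe_cryptography_binding is hypothetical), not code from the test suite:

import importlib

def probe_cryptography_binding():
    # Try to load the compiled Rust binding that hashes.py relies on.
    try:
        rust_openssl = importlib.import_module(
            "cryptography.hazmat.bindings._rust.openssl"
        )
    except ImportError as exc:
        return f"binding missing: {exc}"
    # On a healthy install this attribute exists; the traceback above shows
    # an AttributeError being raised at this lookup instead.
    return "ok" if hasattr(rust_openssl, "hashes") else "hashes attribute missing"

if __name__ == "__main__":
    print(probe_cryptography_binding())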
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime(conn, request): 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame( 2413s {"A": date_range("2013-01-01 09:00:00", periods=3), "B": np.arange(3.0)} 2413s ) 2413s > assert df.to_sql(name="test_datetime", con=conn) == 3 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2955: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( A B 2413s 0 2013-01-01 09:00:00 0.0 2413s 1 2013-01-02 09:00:00 1.0 2413s 2 2013-01-03 09:00:00 2.0,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'name': 'test_datetime'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = A B 2413s 0 2013-01-01 09:00:00 0.0 2413s 1 2013-01-02 09:00:00 1.0 2413s 2 2013-01-03 09:00:00 2.0 2413s name = 'test_datetime' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s 
version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. 
versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = A B 2413s 0 2013-01-01 09:00:00 0.0 2413s 1 2013-01-02 09:00:00 1.0 2413s 2 2013-01-03 09:00:00 2.0 2413s name = 'test_datetime' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). 
If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________________ test_datetime[postgresql_psycopg2_conn] ____________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2951: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________________ test_datetime_NaT[mysql_pymysql_engine] ____________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_NaT(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2976: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1140b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1140b0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________________ test_datetime_NaT[mysql_pymysql_conn] _____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_NaT(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2976: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
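Note: the AttributeError reported above is the root cause of every mysql_pymysql_* failure in this run: cryptography's hashes.py aborts at import time, which takes down pymysql._auth, pymysql, and hence the MySQL fixtures. A minimal probe of that condition, using the same import the quoted hashes.py performs, is sketched below; it assumes python3-cryptography is importable at all, and the reading that the Python layer and the compiled _rust extension come from mismatched builds is an inference from the traceback, not something the log states.

# Probe whether the compiled _rust extension exposes the `openssl.hashes`
# submodule that the pure-Python layer expects.
from cryptography.hazmat.bindings._rust import openssl as rust_openssl

print(hasattr(rust_openssl, "hashes"))
# True on a consistent install; False on this testbed, which is exactly the
# point where `import cryptography.hazmat.primitives.hashes` raises above.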
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
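Note: the FixtureDef.execute code quoted here also explains why the identical AttributeError is reported for every test that requests the MySQL engine fixture: the first failure is stored in cached_result and re-raised on later requests within the fixture's scope, without running the fixture body again. A hypothetical minimal pytest file sketching that caching behaviour follows (all names invented for illustration; run it with pytest).

# test_cached_fixture_error.py -- sketch of exception caching in scoped fixtures
import pytest

setup_attempts = []

@pytest.fixture(scope="module")
def broken_engine():
    setup_attempts.append(1)      # appended only once per module
    raise RuntimeError("no database server available")

def test_first(broken_engine):
    pass                          # reported as an error during fixture setup

def test_second(broken_engine):
    pass                          # same cached RuntimeError; fixture body not re-run

def test_attempt_count():
    assert len(setup_attempts) == 1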
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10c110>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10c110> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________________ test_datetime_NaT[postgresql_psycopg2_engine] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_NaT(conn, request): 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame( 2413s {"A": date_range("2013-01-01 09:00:00", periods=3), "B": np.arange(3.0)} 2413s ) 2413s df.loc[1, "A"] = np.nan 2413s > assert df.to_sql(name="test_datetime", con=conn, index=False) == 3 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2981: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( A B 2413s 0 2013-01-01 09:00:00 0.0 2413s 1 NaT 1.0 2413s 2 2013-01-03 09:00:00 2.0,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_datetime'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = A B 2413s 0 2013-01-01 09:00:00 0.0 2413s 1 NaT 1.0 2413s 2 2013-01-03 09:00:00 2.0 2413s name = 'test_datetime' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s 
@deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. 
versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = A B 2413s 0 2013-01-01 09:00:00 0.0 2413s 1 NaT 1.0 2413s 2 2013-01-03 09:00:00 2.0 2413s name = 'test_datetime' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). 
If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
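[Editor's note] The ``raw_connection`` docstring quoted above describes checking a pool-proxied DBAPI connection straight out of the engine's pool. A minimal sketch of that pattern, assuming an in-memory SQLite engine rather than the unreachable PostgreSQL engine used by the test:

    # Editor's sketch of Engine.raw_connection() as described in the docstring
    # above; the SQLite URL is an assumption so the snippet needs no server.
    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")
    raw = engine.raw_connection()   # proxied DBAPI connection from the pool
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchone())       # (1,)
    finally:
        raw.close()                 # returns the connection to the pool, not a real close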
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
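[Editor's note] The psycopg2 ``connect`` docstring above explains that a DSN string and keyword arguments can be mixed; the next frame shows that merge being performed by ``_ext.make_dsn(dsn, **kwargs)`` before the connection attempt fails. A small sketch of that merge step in isolation (the parameter values are illustrative and no connection is attempted):

    # Editor's sketch: merging keyword arguments into a libpq DSN string, the
    # step done by _ext.make_dsn(dsn, **kwargs) in the psycopg2 frame below.
    from psycopg2.extensions import make_dsn

    dsn = make_dsn("host=localhost dbname=pandas", user="postgres", port=5432)
    print(dsn)  # a single DSN string carrying host, dbname, user and port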
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________________ test_datetime_NaT[postgresql_psycopg2_conn] __________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_NaT(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:2976: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________________ test_datetime_date[mysql_pymysql_engine] ___________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_date(conn, request): 2413s # test support for datetime.date 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3000: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10c950>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10c950> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________________ test_datetime_date[mysql_pymysql_conn] ____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_date(conn, request): 2413s # test support for datetime.date 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3000: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
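The AttributeError that ends the traceback above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while pymysql imports its cryptography-backed auth helpers, so every mysql_pymysql_* fixture errors at import time instead of skipping. This usually points to a cryptography install whose pure-Python layer and compiled _rust extension are out of sync. A minimal probe, assuming it runs on the same testbed, would be:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print("cryptography version:", cryptography.__version__)
    # The attribute hashes.py fails on; False here reproduces the failure in the log.
    print("rust bindings expose 'hashes':", hasattr(rust_openssl, "hashes"))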
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10c8f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10c8f0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________________ test_datetime_date[postgresql_psycopg2_engine] ________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_date(conn, request): 2413s # test support for datetime.date 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame([date(2014, 1, 1), date(2014, 1, 2)], columns=["a"]) 2413s > assert df.to_sql(name="test_date", con=conn, index=False) == 2 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3002: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( a 2413s 0 2014-01-01 2413s 1 2014-01-02,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_date'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = a 2413s 0 2014-01-01 2413s 1 2014-01-02, name = 'test_date' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 
2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 
2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... 
stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = a 2413s 0 2014-01-01 2413s 1 2014-01-02, name = 'test_date' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 
2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
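The Engine.connect()/raw_connection() machinery walked through in this traceback can be exercised without a live PostgreSQL server. A small sketch against an in-memory SQLite engine, with the OperationalError handling that a refused TCP connection (as seen above) would trigger; the URL and query are illustrative:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    # Substitute a postgresql+psycopg2 URL to reproduce the failure above.
    engine = create_engine("sqlite://")

    try:
        # connect() checks a DBAPI connection out of the pool; a refused
        # TCP connection surfaces here as sqlalchemy.exc.OperationalError.
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except OperationalError as exc:
        print(f"database unreachable: {exc}")

    # raw_connection() hands back the pooled DBAPI connection directly;
    # close() returns it to the pool rather than closing it for real.
    raw = engine.raw_connection()
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchall())
    finally:
        raw.close()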
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
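The psycopg2.connect docstring above lists the keyword parameters in play here. A sketch of the same call with explicit handling of the OperationalError raised in this run; the credentials mirror the DSN shown in the log and assume a locally running server:

    import psycopg2

    # These parameters only succeed if a PostgreSQL server is actually
    # listening on localhost:5432, which is not the case in this testbed.
    try:
        conn = psycopg2.connect(
            host="localhost",
            port=5432,
            dbname="pandas",
            user="postgres",
            password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # This is the branch taken in the run above: nothing listens on
        # port 5432, so the connection is refused.
        print(f"connection failed: {exc}")
    else:
        with conn, conn.cursor() as cur:
            cur.execute("SELECT version()")
            print(cur.fetchone())
        conn.close()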
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________________ test_datetime_date[postgresql_psycopg2_conn] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_date(conn, request): 2413s # test support for datetime.date 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3000: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
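The fixture-resolution code above is driven by tests that parametrize over fixture *names* and resolve them lazily with request.getfixturevalue(), as test_sql.py does with its connection fixtures. A minimal illustrative sketch of that pattern; the fixture names are hypothetical:

    import pytest

    # Hypothetical stand-ins for the engine/connection fixtures used by
    # pandas' test_sql.py.
    @pytest.fixture
    def fake_engine():
        return "engine-object"

    @pytest.fixture
    def fake_conn(fake_engine):
        return f"connection-from-{fake_engine}"

    # Parametrize over fixture names, then resolve each one at test time
    # with request.getfixturevalue(), the pattern shown in the traceback.
    @pytest.mark.parametrize("conn", ["fake_engine", "fake_conn"])
    def test_lazy_fixture_lookup(conn, request):
        resolved = request.getfixturevalue(conn)
        assert resolved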
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
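For the parametrized-fixture path handled above (callspec params, cached values, finalizers), a small sketch of what such a fixture looks like from the test author's side; the names and parameter values are illustrative only:

    import pytest

    # A parametrized fixture: pytest runs dependent tests once per param and
    # caches the value for the duration of the request.
    @pytest.fixture(params=["sqlite", "postgres"])
    def backend_name(request):
        yield request.param
        # Teardown runs after each parametrized instance.
        print(f"tearing down {request.param}")

    @pytest.fixture
    def backend_url(backend_name):
        # Depends on the parametrized fixture and is finalized before it.
        return f"{backend_name}://example"

    def test_backend_url(backend_url):
        assert "://" in backend_url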
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________________ test_datetime_time[mysql_pymysql_engine] ___________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s sqlite_buildin = 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_time(conn, request, sqlite_buildin): 2413s # test support for datetime.time 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3014: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
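[editor sketch] The frames above show the pattern the pandas SQL tests rely on: the test is parametrized over fixture *names* and resolves the real fixture lazily with request.getfixturevalue(), so a backend that cannot be set up only fails its own parametrization. A small sketch of that pattern, assuming only pytest; the fixture name and return value are hypothetical stand-ins, not the pandas fixtures:

    import pytest

    @pytest.fixture
    def sqlite_str():
        # Stand-in for an engine fixture; the real fixtures build a
        # SQLAlchemy engine or DBAPI connection here.
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_roundtrip(conn, request):
        # The parameter is a fixture *name*; resolve it at test time so a
        # backend that cannot be created only affects its own case.
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")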
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d130>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d130> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________________ test_datetime_time[mysql_pymysql_conn] ____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s sqlite_buildin = 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_time(conn, request, sqlite_buildin): 2413s # test support for datetime.time 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3014: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
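[editor sketch] Both MySQL-backed parametrizations fail identically: importing pymysql pulls in pymysql._auth, which imports cryptography, and cryptography's pure-Python layer expects a hashes submodule on its compiled Rust bindings that this build does not provide. That usually points to the installed cryptography Python files and the _rust extension coming from different versions. A small check, assuming the same testbed, that inspects the two layers without involving pandas or pymysql:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print("cryptography version:", cryptography.__version__)
    # hashes.py:87 does `Hash = rust_openssl.hashes.Hash`; on this testbed
    # the attribute is missing, which is what raises the AttributeError above.
    print("rust bindings expose 'hashes':", hasattr(rust_openssl, "hashes"))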
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d0d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d0d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________________ test_datetime_time[postgresql_psycopg2_engine] ________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
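[editor sketch] The postgresql_psycopg2_engine failure that begins above walks the same pool path as the earlier psycopg2 error: Connection.__init__ asks the Engine for a raw DBAPI connection, the pool has no spare entry, so it creates a new _ConnectionRecord, which calls psycopg2.connect and is refused. A minimal sketch of exercising that path directly, using the engine URL from the traceback with the credentials shown in the earlier cparams:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    # Engine URL matching the one in the traceback.
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        # raw_connection() -> pool.connect() -> _ConnectionRecord.__connect()
        # -> psycopg2.connect(): the chain shown in the frames above.
        conn = engine.raw_connection()
    except OperationalError as exc:
        print("server unreachable:", exc)
    else:
        conn.close()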
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s sqlite_buildin = 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_time(conn, request, sqlite_buildin): 2413s # test support for datetime.time 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame([time(9, 0, 0), time(9, 1, 30)], columns=["a"]) 2413s > assert df.to_sql(name="test_time", con=conn, index=False) == 2 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3016: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( a 2413s 0 09:00:00 2413s 1 09:01:30,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_time'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = a 2413s 0 09:00:00 2413s 1 09:01:30, name = 'test_time' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 
2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. 
versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = a 2413s 0 09:00:00 2413s 1 09:01:30, name = 'test_time' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 
2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
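Editor's note: the to_sql docstrings quoted above already carry a worked example against an in-memory SQLite engine; here is a self-contained version of that example, assuming pandas and SQLAlchemy are importable (they are in this environment, only the PostgreSQL/MySQL servers are missing):

    import pandas as pd
    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")  # in-memory database, no server required
    df = pd.DataFrame({"name": ["User 1", "User 2", "User 3"]})

    # Returns the number of rows written; if_exists defaults to "fail".
    print(df.to_sql(name="users", con=engine, index=False))  # 3

    with engine.connect() as conn:
        print(conn.execute(text("SELECT * FROM users")).fetchall())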
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
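Editor's note: raw_connection() and Pool.connect(), quoted above, are the internal checkout path; the Engine.connect docstring earlier in this traceback shows the intended public pattern. A runnable sketch of that pattern against SQLite, so no external server is assumed:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")

    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()  # leaving the block without commit() rolls back
        print(connection.execute(text("SELECT x FROM t")).scalar_one())  # 1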
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
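Editor's note: the psycopg2.connect docstring above lists the basic parameters; they match the DSN SQLAlchemy builds for these tests (host=localhost, port=5432, dbname=pandas, user/password=postgres). A minimal sketch of calling the driver directly and handling the same refusal, using those credentials from the log:

    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432,
            dbname="pandas", user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # The error seen in this log: nothing is listening on localhost:5432.
        print(f"PostgreSQL unreachable: {exc}")
    else:
        conn.close()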
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________________ test_datetime_time[postgresql_psycopg2_conn] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
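Editor's note: every PostgreSQL-backed parametrization in this run fails the same way because nothing is listening on localhost:5432. A hedged sketch of a socket-level reachability probe one could use to skip such tests up front; this helper is not part of the pandas suite:

    import socket

    def postgres_reachable(host="localhost", port=5432, timeout=1.0):
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # e.g. pytest.mark.skipif(not postgres_reachable(), reason="no local PostgreSQL")
    print(postgres_reachable())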
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s sqlite_buildin = 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_datetime_time(conn, request, sqlite_buildin): 2413s # test support for datetime.time 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3014: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
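Editor's note: the finalizer bookkeeping above is what lets a yield-style fixture, such as the postgresql_psycopg2_conn fixture that wraps engine.connect() further down, tear its connection down after the test. A small sketch of that shape using SQLite so it runs without a server; the fixture names here are illustrative only:

    import pytest
    from sqlalchemy import create_engine

    @pytest.fixture
    def sqlite_engine():
        engine = create_engine("sqlite://")
        yield engine
        engine.dispose()  # teardown runs when pytest fires the finalizer

    @pytest.fixture
    def sqlite_conn(sqlite_engine):
        with sqlite_engine.connect() as conn:
            yield conn  # connection is closed when the with-block exits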
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
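The DSN in the frame above (host=localhost dbname=pandas user=postgres password=postgres port=5432) is what the postgresql_psycopg2_engine fixture ultimately hands to psycopg2. A minimal sketch of the connection the fixture attempts, assuming a PostgreSQL server were actually listening on localhost:5432 (this testbed has none, hence the "Connection refused" OperationalError that follows):

    from sqlalchemy import create_engine, text

    # Mirrors the engine shown in the traceback; credentials taken from the DSN
    # above. With no local PostgreSQL server, connect() raises
    # sqlalchemy.exc.OperationalError wrapping psycopg2's "Connection refused".
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))

These parametrizations expect a reachable server; without one they error during fixture setup rather than running the test body.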
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________________ test_mixed_dtype_insert[mysql_pymysql_engine] _________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_mixed_dtype_insert(conn, request): 2413s # see GH6509 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3040: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d850>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d850> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
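The mysql_pymysql_engine fixture reached this import through td.versioned_importorskip("pymysql") (shown earlier in the frame), which imports the optional dependency and skips the test when it is missing or too old. A rough equivalent with plain pytest, assuming only that some 1.x pymysql is wanted (the exact minimum here is an assumption):

    import pytest

    # Skip rather than fail when the optional MySQL driver is absent or too old;
    # pandas' versioned_importorskip wraps the same idea but defaults the minimum
    # to pandas' own supported version.
    pymysql = pytest.importorskip("pymysql", minversion="1.0")

In the failure below, however, the import raises AttributeError (from cryptography) rather than ImportError, so the skip machinery does not catch it and the parametrization is reported as an error instead of a skip.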
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
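The import chain bottoms out in python3-cryptography's hashes module, whose module body dereferences a binding provided by the compiled Rust extension. A small, hypothetical probe (not part of the test suite) for the mismatch reported below:

    # hashes.py does `Hash = rust_openssl.hashes.Hash` at import time; if the
    # installed Rust extension does not expose `hashes`, any import of pymysql
    # (via cryptography) fails with the AttributeError seen in this log.
    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(cryptography.__version__)
    print(hasattr(rust_openssl, "hashes"))  # False on this testbed, per the traceback

Such a mismatch usually points at a skew between the cryptography Python package and its _rust extension (for example a partially upgraded installation) rather than at pandas or pymysql themselves.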
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _________________ test_mixed_dtype_insert[mysql_pymysql_conn] __________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_mixed_dtype_insert(conn, request): 2413s # see GH6509 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3040: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
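The getfixturevalue docstring above is the mechanism the pandas SQL tests lean on: the connection fixture is named by a parametrize value and resolved at run time inside the test body. A self-contained sketch of the pattern, with a hypothetical sqlite_str fixture standing in for the driver-specific ones in test_sql.py:

    import pytest

    @pytest.fixture
    def sqlite_str():
        # Hypothetical stand-in; test_sql.py defines one engine/conn fixture
        # per backend in the same style.
        return "sqlite://"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_dynamic_fixture(conn, request):
        # Same pattern as test_mixed_dtype_insert: resolve the fixture by name.
        conn = request.getfixturevalue(conn)
        assert conn == "sqlite://"

Because resolution happens at test time, an unusable backend shows up as a per-parametrization setup error, which is exactly the shape of the failures in this log.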
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
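The comments above describe overriding a fixture by name, where the overriding fixture may request its own name and receives the value one level further out. A two-file sketch of that behaviour (hypothetical fixture name, for illustration only):

    # conftest.py
    import pytest

    @pytest.fixture
    def backend():
        return "base"

    # test_override.py
    import pytest

    @pytest.fixture
    def backend(backend):
        # Requesting our own name yields the conftest value, one level up.
        return backend + "+override"

    def test_backend(backend):
        assert backend == "base+override"

The negative indexing in the code that follows is how pytest walks that chain: each time the name reappears in the request chain it steps one definition further out.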
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
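The guarded == comparison above falls back to identity because some cache keys cannot be truth-tested after comparison; numpy arrays are the case cited in the comment (#6497). A short illustration:

    import numpy as np

    request_cache_key = np.array([0, 1])
    cache_key = np.array([0, 1])
    try:
        # Element-wise == returns an array; bool() on it raises ValueError.
        cache_hit = bool(request_cache_key == cache_key)
    except (ValueError, RuntimeError):
        # Fall back to identity, as the fixture code does.
        cache_hit = request_cache_key is cache_key
    print(cache_hit)  # False: distinct array objects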
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d970>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10d970> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
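The import_optional_dependency docstring quoted above describes the errors= modes. A small sketch of how that helper behaves; it is a pandas-internal API (pandas.compat._optional), so treat this as illustrative rather than stable:

from pandas.compat._optional import import_optional_dependency

# errors="raise" (default): ImportError with an install hint if missing.
# errors="ignore": return None when missing, the module otherwise.
pymysql = import_optional_dependency("pymysql", errors="ignore")
if pymysql is None:
    print("pymysql is not installed; MySQL-backed tests would be skipped")
else:
    print("pymysql", pymysql.VERSION_STRING)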
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _____________ test_mixed_dtype_insert[postgresql_psycopg2_engine] ______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
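The failure above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") is raised while binding Hash to the Rust extension; it usually points to a mismatch between the installed cryptography Python package and its compiled _rust bindings rather than to a pandas or pymysql bug. On a consistent installation, the HashContext interface quoted above is used roughly like this (a minimal sketch):

from cryptography.hazmat.primitives import hashes

# Streaming digest: update() feeds data, finalize() returns the digest bytes.
digest = hashes.Hash(hashes.SHA256())
digest.update(b"hello ")
digest.update(b"world")
print(digest.finalize().hex())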
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
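The Engine.raw_connection() and Pool.connect() docstrings quoted above describe checking a proxied DBAPI connection out of the pool. A self-contained sketch against an in-memory SQLite database, so no external server is needed (unlike the PostgreSQL engine in this test run):

from sqlalchemy import create_engine

engine = create_engine("sqlite://")
raw = engine.raw_connection()        # proxied DBAPI connection from the pool
try:
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())            # (1,)
finally:
    raw.close()                      # returns the connection to the pool
engine.dispose()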
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_mixed_dtype_insert(conn, request): 2413s # see GH6509 2413s conn = request.getfixturevalue(conn) 2413s s1 = Series(2**25 + 1, dtype=np.int32) 2413s s2 = Series(0.0, dtype=np.float32) 2413s df = DataFrame({"s1": s1, "s2": s2}) 2413s 2413s # write and read again 2413s > assert df.to_sql(name="test_read_write", con=conn, index=False) == 1 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3046: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( s1 s2 2413s 0 33554433 0.0,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_read_write'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = s1 s2 2413s 0 33554433 0.0, name = 'test_read_write' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], 
name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. 
versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = s1 s2 2413s 0 33554433 0.0, name = 'test_read_write' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 
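The failing assertion above is a simple write/read round trip through DataFrame.to_sql. A hedged sketch of the same operation against an in-memory SQLite engine (which needs no running server), mirroring the dtypes used by test_mixed_dtype_insert:

import numpy as np
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite://")
df = pd.DataFrame({
    "s1": pd.Series(2**25 + 1, dtype=np.int32),
    "s2": pd.Series(0.0, dtype=np.float32),
})
# to_sql reports the number of rows written; one row here.
assert df.to_sql(name="test_read_write", con=engine, index=False) == 1
print(pd.read_sql_table("test_read_write", con=engine))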
2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
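pandasSQL_builder, quoted above, dispatches between the SQLite fallback and the SQLAlchemy path. It is a pandas-internal helper, so the sketch below is illustrative only:

import sqlite3
from sqlalchemy import create_engine
from pandas.io.sql import pandasSQL_builder

# A plain sqlite3 connection takes the fallback path...
with pandasSQL_builder(sqlite3.connect(":memory:")) as pandas_sql:
    print(type(pandas_sql).__name__)    # SQLiteDatabase

# ...while a SQLAlchemy connectable (or URI string) goes through SQLDatabase.
with pandasSQL_builder(create_engine("sqlite://")) as pandas_sql:
    print(type(pandas_sql).__name__)    # SQLDatabase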
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
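_handle_dbapi_exception_noconnection, quoted above, wraps the driver's error into sqlalchemy.exc.OperationalError while keeping the original exception reachable. A hedged sketch of what the caller sees; like this test run, it assumes nothing is listening on localhost:5432:

from sqlalchemy import create_engine, text
from sqlalchemy.exc import OperationalError

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
except OperationalError as exc:
    # exc.orig is the underlying psycopg2.OperationalError from the traceback.
    print("could not connect:", exc.orig)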
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
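For reference, the keyword-argument form documented in the psycopg2.connect docstring above. This sketch assumes a PostgreSQL server is actually reachable on localhost:5432, which is exactly what was missing in this test environment:

import psycopg2

conn = psycopg2.connect(
    dbname="pandas",
    user="postgres",
    password="postgres",
    host="localhost",
    port=5432,
)
with conn, conn.cursor() as cur:    # "with conn" wraps a transaction, not close()
    cur.execute("SELECT version()")
    print(cur.fetchone()[0])
conn.close()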
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________ test_mixed_dtype_insert[postgresql_psycopg2_conn] _______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_mixed_dtype_insert(conn, request): 2413s # see GH6509 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3040: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________________ test_nan_numeric[mysql_pymysql_engine] ____________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_numeric(conn, request): 2413s # NaNs in numeric float column 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3055: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
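The import_optional_dependency docstring quoted above describes pandas' wrapper around importlib. A short usage sketch of the errors= behaviour it documents (pandas.compat._optional is a private pandas module, shown here only for illustration, and the snippet assumes a normally working environment):

from pandas.compat._optional import import_optional_dependency

# errors="raise" (the default): a missing or too-old module raises ImportError.
# errors="ignore": return the module if it imports cleanly, otherwise return None.
pymysql = import_optional_dependency("pymysql", errors="ignore")
print("pymysql importable:", pymysql is not None)

In this log, however, the pymysql import fails with an AttributeError raised while executing pymysql's own imports (see the following frames), not an ImportError, so neither errors="ignore" nor the surrounding versioned_importorskip() turns it into a skip: the exception propagates and the test fails.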
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10df70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10df70> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _____________________ test_nan_numeric[mysql_pymysql_conn] _____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_numeric(conn, request): 2413s # NaNs in numeric float column 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3055: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
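The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') means the mysql_pymysql_* fixtures never reach a MySQL server at all: importing pymysql imports cryptography.hazmat.primitives.hashes, and the installed python3-cryptography's compiled _rust bindings do not expose the hashes attribute that the pure-Python layer expects, so pymysql cannot even be imported. A minimal probe, a sketch rather than anything from the test suite, that reproduces just that import failure:

import importlib

def probe(name: str) -> str:
    # Try to import a module and report either success or the exception it raises.
    try:
        importlib.import_module(name)
        return f"{name}: import OK"
    except Exception as exc:
        return f"{name}: {type(exc).__name__}: {exc}"

for name in ("cryptography.hazmat.primitives.hashes", "pymysql"):
    print(probe(name))

On this testbed both imports fail with the AttributeError shown above; with a consistent cryptography installation both would report import OK, leaving only the separate question of whether a MySQL server is actually reachable.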
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
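# Illustrative sketch (not part of pytest) of the comparison fallback used in
# execute() above: some objects, e.g. numpy arrays, raise when the result of ==
# is coerced to bool, so identity is used as a last resort (pytest issue #6497).
import numpy as np

def _same_cache_key(a, b):
    try:
        return bool(a == b)
    except (ValueError, RuntimeError):
        return a is b

print(_same_cache_key(1, 1))                          # True
print(_same_cache_key(np.arange(3), np.arange(3)))    # False: falls back to identity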
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10e090>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10e090> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
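# A hedged usage sketch of the mysqlclient-compatibility shim described above
# (its one-line implementation follows just below): after calling it, importing
# MySQLdb transparently yields the pymysql package. On this testbed the import of
# pymysql itself fails via cryptography, so this only sketches the intended behaviour.
import pymysql

pymysql.install_as_MySQLdb()
import MySQLdb  # resolves to pymysql via sys.modules

print(MySQLdb is pymysql)   # True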
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _________________ test_nan_numeric[postgresql_psycopg2_engine] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_numeric(conn, request): 2413s # NaNs in numeric float column 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame({"A": [0, 1, 2], "B": [0.2, np.nan, 5.6]}) 2413s > assert df.to_sql(name="test_nan", con=conn, index=False) == 3 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3057: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( A B 2413s 0 0 0.2 2413s 1 1 NaN 2413s 2 2 5.6,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_nan'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = A B 2413s 0 0 0.2 2413s 1 1 NaN 2413s 2 2 5.6, name = 'test_nan' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s 
con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 
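# A hedged sketch of the raw DBAPI connection attempt whose failure is shown
# above; host, port and credentials mirror the DSN from the log and are
# assumptions about the intended CI database setup.
import psycopg2

try:
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
    conn.close()
except psycopg2.OperationalError as exc:
    # No PostgreSQL server is listening on localhost:5432 on this testbed,
    # so the connection is refused and the postgresql_psycopg2_* fixtures fail.
    print(exc)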
2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... 
stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = A B 2413s 0 0 0.2 2413s 1 1 NaN 2413s 2 2 5.6, name = 'test_nan' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 
2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __________________ test_nan_numeric[postgresql_psycopg2_conn] __________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_numeric(conn, request): 2413s # NaNs in numeric float column 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3055: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
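The execute() source quoted above caches a fixture's failure alongside its value (cached_result[2]) for the fixture's scope; the engine fixtures in this run are function-scoped, so each test retries the connection, but with a broader scope the original exception would simply be re-raised for later tests. A small illustrative sketch, assuming a session-scoped fixture with made-up names:

    import pytest

    @pytest.fixture(scope="session")
    def shared_resource():
        # Runs once per session; if it raises, pytest caches the exception
        # for the whole scope (the cached_result[2] branch shown above).
        raise RuntimeError("resource unavailable")

    def test_first(shared_resource):
        pass  # errors during setup with RuntimeError

    def test_second(shared_resource):
        pass  # errors again from the cached exception; the fixture body is not re-run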
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
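What finally reaches the test is not the raw psycopg2 error but the sqlalchemy.exc.OperationalError that _handle_dbapi_exception_noconnection wraps it in, which is where the sqlalche.me/e/20/e3q8 link comes from. A minimal sketch of catching it at the engine level, using the same URL as the log (the URL masks the password, but the DSN above shows it as "postgres"):

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")

    try:
        with engine.connect() as conn:   # the pool's DBAPI error is wrapped here
            pass
    except OperationalError as exc:
        print(exc.orig)                  # the underlying psycopg2.OperationalError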
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __________________ test_nan_fullcolumn[mysql_pymysql_engine] ___________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_fullcolumn(conn, request): 2413s # full NaN column (numeric float column) 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3071: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
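The mysql_pymysql_engine fixture above goes through pandas' versioned_importorskip, which its own docstring describes as a Debian-specific wrapper around import_optional_dependency that skips the test when the driver is missing or too old. Outside pandas' tree, the closest stock equivalent is pytest.importorskip; a hedged sketch (test name and minversion value are illustrative):

    import pytest

    # Skip the tests in this module (rather than error) when the optional
    # driver is absent or older than the illustrative minimum.
    pymysql = pytest.importorskip("pymysql", minversion="1.0")

    def test_needs_pymysql():
        assert pymysql.__version__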
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10e810>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10e810> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________________ test_nan_fullcolumn[mysql_pymysql_conn] ____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_fullcolumn(conn, request): 2413s # full NaN column (numeric float column) 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3071: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
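The AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") is the real reason the mysql_pymysql_* fixtures fail: pymysql's _auth module imports cryptography, and on this testbed the Python layer of python3-cryptography apparently does not match its compiled _rust bindings (a plausible reading of the traceback, not something the log states directly). Because cryptography fails with AttributeError rather than ImportError, the try/except that pymysql puts around the import, which guards against cryptography being absent, does not swallow it and the whole `import pymysql` fails. A hypothetical minimal reproduction on the same testbed:

    # Assumes the same python3-cryptography build as on this image, where the Rust
    # extension appears to lack the 'hashes' submodule expected by hashes.py:87.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl
    print(hasattr(rust_openssl, "hashes"))               # expected to print False here
    from cryptography.hazmat.primitives import hashes     # would raise the AttributeError above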
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10e870>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10e870> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________ test_nan_fullcolumn[postgresql_psycopg2_engine] ________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_fullcolumn(conn, request): 2413s # full NaN column (numeric float column) 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame({"A": [0, 1, 2], "B": [np.nan, np.nan, np.nan]}) 2413s > assert df.to_sql(name="test_nan", con=conn, index=False) == 3 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3073: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( A B 2413s 0 0 NaN 2413s 1 1 NaN 2413s 2 2 NaN,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_nan'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = A B 2413s 0 0 NaN 2413s 1 1 NaN 2413s 2 2 NaN, name = 'test_nan' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 
2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. 
versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = A B 2413s 0 0 NaN 2413s 1 1 NaN 2413s 2 2 NaN, name = 'test_nan' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 
2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
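[editor's note] The two to_sql docstrings quoted above show the custom ``method`` callable only for PostgreSQL and MySQL, with the doctests skipped. A minimal sketch of the same pattern against an in-memory SQLite engine (table and column names here are illustrative, not taken from the test suite) would be:

    # Sketch: a custom `method` callable for DataFrame.to_sql, run against an
    # in-memory SQLite engine so it needs no external database server.
    import pandas as pd
    from sqlalchemy import create_engine, text

    def count_rows_insert(pd_table, conn, keys, data_iter):
        # Row-wise insert via SQLAlchemy; the returned int is what to_sql
        # reports back as "rows affected".
        data = [dict(zip(keys, row)) for row in data_iter]
        result = conn.execute(pd_table.table.insert(), data)
        return result.rowcount

    engine = create_engine("sqlite://")
    df = pd.DataFrame({"a": [1, 2, 3]})
    rows = df.to_sql(name="demo", con=engine, index=False, method=count_rows_insert)
    print(rows)  # expected: 3

    with engine.connect() as conn:
        print(conn.execute(text("SELECT COUNT(*) FROM demo")).scalar())  # expected: 3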
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
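[editor's note] The Engine.connect() docstring quoted in the frames above describes the context-manager pattern that the pandas fixtures rely on. A small sketch of that pattern with an in-memory SQLite engine (so no DBAPI connection can be refused):

    # Sketch of the engine.connect() usage quoted in the SQLAlchemy frames above.
    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://", echo=False)

    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()  # the autobegun transaction must be committed explicitly

    # Leaving the block returns the DBAPI connection to the pool and rolls back
    # any transaction still in progress.
    with engine.connect() as connection:
        print(connection.execute(text("SELECT x FROM t")).fetchall())  # [(1,)]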
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________________ test_nan_fullcolumn[postgresql_psycopg2_conn] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
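[editor's note] The failure itself is environmental: psycopg2 cannot reach a PostgreSQL server on localhost:5432, so every SQLAlchemy-backed SQL test errors during fixture setup. A hedged sketch of a pre-flight reachability check, reusing the connection parameters visible in the cparams above (whether to skip or fail on refusal is up to the harness):

    # Sketch: probe the PostgreSQL server the pandas SQL tests expect on
    # localhost:5432. The DSN mirrors the cparams shown in the traceback.
    import psycopg2

    def postgres_reachable(
        dsn="host=localhost dbname=pandas user=postgres password=postgres port=5432",
    ):
        try:
            conn = psycopg2.connect(dsn)
        except psycopg2.OperationalError as exc:
            # e.g. "connection to server at 'localhost' (127.0.0.1), port 5432
            # failed: Connection refused"
            print(f"PostgreSQL not reachable: {exc}")
            return False
        conn.close()
        return True

    if __name__ == "__main__":
        print(postgres_reachable())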
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_fullcolumn(conn, request): 2413s # full NaN column (numeric float column) 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3071: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
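[editor's note] As test_nan_fullcolumn above shows, the pandas SQL tests parametrize over fixture *names* and resolve them at setup time with request.getfixturevalue. A minimal, hypothetical sketch of that pattern (the fixture name and table are illustrative; the real suite lists many backends):

    # Hypothetical sketch of the dynamic-fixture pattern used by pandas' SQL tests.
    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_memory_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    connectable_fixtures = ["sqlite_memory_conn"]  # stand-in for sqlalchemy_connectable

    @pytest.mark.parametrize("conn", connectable_fixtures)
    def test_roundtrip(conn, request):
        conn = request.getfixturevalue(conn)  # FixtureLookupError if the name is unknown
        conn.execute("CREATE TABLE t (x INTEGER)")
        conn.execute("INSERT INTO t VALUES (1)")
        assert conn.execute("SELECT x FROM t").fetchall() == [(1,)]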
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________________ test_nan_string[mysql_pymysql_engine] _____________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_string(conn, request): 2413s # NaNs in string column 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3089: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
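[editor's note] The pandas SQL tests pass fixture names as parametrize values and resolve them at runtime with request.getfixturevalue(), as the docstring above describes. A standalone sketch of that pattern (the fixture name here is made up):

    import pytest

    @pytest.fixture
    def fake_engine():
        return "engine-object"

    @pytest.mark.parametrize("conn", ["fake_engine"])
    def test_resolves_fixture_by_name(conn, request):
        conn = request.getfixturevalue(conn)   # turns the string into the fixture value
        assert conn == "engine-object"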
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
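[editor's note] FixtureDef.execute() above registers functools.partial-based finalizers so that tearing down a requested fixture also tears down its dependents. The same mechanism is exposed to test authors through request.addfinalizer(); a small sketch:

    import pytest

    @pytest.fixture
    def resource(request):
        handle = {"open": True}

        def close():
            handle["open"] = False

        request.addfinalizer(close)   # runs at teardown, like the partial-based finalizers above
        return handle

    def test_uses_resource(resource):
        assert resource["open"]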
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
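[editor's note] The HookCaller locals above show firstresult = True for pytest_fixture_setup, which is why the hook call returns a single value rather than a list of results. A minimal pluggy sketch of that behaviour (the project name "demo" and plugin classes are made up):

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def compute(self, x):
            """The first non-None implementation result wins."""

    class Doubler:
        @hookimpl
        def compute(self, x):
            return x * 2

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Doubler())
    assert pm.hook.compute(x=3) == 6   # a single result, not [6]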
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
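[editor's note] versioned_importorskip is described above as a Debian-specific replacement for pytest.importorskip. The upstream helper it stands in for is used like this (a sketch; the minimum version is illustrative):

    import pytest

    # Skip the test (rather than fail) when the optional driver is missing or
    # older than the requested minimum version.
    pymysql = pytest.importorskip("pymysql", minversion="1.0")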
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
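[editor's note] import_optional_dependency ultimately defers to importlib.import_module(), as the frames above show. A stripped-down sketch of the same optional-import pattern (try_import is a hypothetical helper, not the pandas function):

    import importlib

    def try_import(name):
        # Return the module if it imports cleanly, otherwise None, so callers
        # can skip work that needs the optional dependency.
        try:
            return importlib.import_module(name)
        except ImportError:
            return None

    pymysql = try_import("pymysql")
    # Note: on this testbed the pymysql import dies with an AttributeError (shown
    # further down), not an ImportError, so a wrapper like this would still propagate it.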
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10ef30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10ef30> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
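[editor's note] The pymysql module source quoted above includes the install_as_MySQLdb() compatibility shim. On a working installation it simply aliases the module in sys.modules (a sketch; not runnable on this testbed, where the pymysql import itself fails):

    import sys
    import pymysql

    pymysql.install_as_MySQLdb()   # sets sys.modules["MySQLdb"] = sys.modules["pymysql"]
    import MySQLdb                 # now resolves to the pymysql module object

    assert MySQLdb is pymysql
    assert sys.modules["MySQLdb"] is sys.modules["pymysql"]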
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _____________________ test_nan_string[mysql_pymysql_conn] ______________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_string(conn, request): 2413s # NaNs in string column 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3089: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
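[editor's note] The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while importing cryptography, before any pymysql code runs, so the cryptography installation on this testbed looks broken rather than pymysql itself. On a healthy build the same hashes module works directly; a quick check, assuming a working cryptography install:

    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas")
    print(digest.finalize().hex())   # here the import above fails before this line is reached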
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
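[editor's note] execute() above caches not only fixture values but also the exception a fixture raised (cached_result[2]), so later tests that share the cached scope re-raise the same error without re-running the fixture body. A tiny sketch of that behaviour (the fixture name is made up):

    import pytest

    SETUPS = []

    @pytest.fixture(scope="module")
    def broken_engine():
        SETUPS.append("setup")
        raise RuntimeError("no database available")

    def test_first(broken_engine):
        pass

    def test_second(broken_engine):
        pass

    # Both tests error, but SETUPS ends up as ["setup"]: the second test receives
    # the cached exception from cached_result instead of running the fixture again.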
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10f050>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10f050> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _________________ test_nan_string[postgresql_psycopg2_engine] __________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_string(conn, request): 2413s # NaNs in string column 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame({"A": [0, 1, 2], "B": ["a", "b", np.nan]}) 2413s > assert df.to_sql(name="test_nan", con=conn, index=False) == 3 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( A B 2413s 0 0 a 2413s 1 1 b 2413s 2 2 NaN,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'test_nan'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = A B 2413s 0 0 a 2413s 1 1 b 2413s 2 2 NaN, name = 'test_nan' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s 
schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 
2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... 
stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = A B 2413s 0 0 a 2413s 1 1 b 2413s 2 2 NaN, name = 'test_nan' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 
2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
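The pandasSQL_builder frame quoted above wraps the connectable in the right PandasSQL subclass and opens the connection and transaction itself. A minimal sketch of the same to_sql call as test_nan_string, but against an in-memory SQLite engine so no external server is required (this mirrors the usage shown in the to_sql docstring earlier in this traceback; the table name is illustrative):

    # minimal sketch, assuming pandas, numpy and SQLAlchemy are installed
    import numpy as np
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")            # in-memory SQLite, no server needed
    df = pd.DataFrame({"A": [0, 1, 2], "B": ["a", "b", np.nan]})
    rows = df.to_sql(name="test_nan", con=engine, index=False)
    print(rows)                                    # 3, matching the assertion in the test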
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
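The checkout chain shown above (raw_connection -> Pool.connect -> _ConnectionFairy._checkout -> _ConnectionRecord.__connect -> dialect.connect) means the Engine opens a real DBAPI connection only on first checkout, not when create_engine() runs in the fixture, which is why the failure surfaces here rather than at fixture creation. A minimal sketch of that lazy behaviour, assuming the same URL as the test configuration in this log:

    # minimal sketch; URL mirrors the engine shown in the traceback
    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    # create_engine() only parses the URL and sets up a pool; no socket yet
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        # first checkout triggers the pool __connect() seen above,
        # which is where psycopg2 raises if nothing listens on port 5432
        with engine.connect() as conn:
            pass
    except OperationalError as exc:   # SQLAlchemy wrapper around psycopg2.OperationalError
        print("database unreachable:", exc)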
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
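The psycopg2.connect docstring above accepts either a dsn string or keyword arguments. A minimal sketch using the same keyword parameters as the cparams dict shown in this traceback (values taken from the log; this is an illustration, not part of the test suite), catching the OperationalError that this run hits because no PostgreSQL server is listening:

    # minimal sketch, assuming psycopg2 is installed
    import psycopg2

    params = {"host": "localhost", "port": 5432, "dbname": "pandas",
              "user": "postgres", "password": "postgres"}
    try:
        conn = psycopg2.connect(**params)   # equivalent to the dsn string shown above
        conn.close()
    except psycopg2.OperationalError as exc:
        # the error seen in this log: connection refused on port 5432
        print("connection failed:", exc)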
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __________________ test_nan_string[postgresql_psycopg2_conn] ___________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_nan_string(conn, request): 2413s # NaNs in string column 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3089: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
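The getfixturevalue docstring and comment above describe dynamic fixture lookup, which is exactly how test_nan_string turns the parametrized string into a live connection. A minimal self-contained sketch of that pattern (fixture and test names here are illustrative, not from the pandas suite):

    # minimal sketch of the fixture-indirection pattern used by test_nan_string
    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn", ["sqlite_conn"])
    def test_roundtrip(conn, request):
        # the parametrized value is a fixture *name*; resolve it at run time
        conn = request.getfixturevalue(conn)
        assert conn.execute("select 1").fetchone() == (1,)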
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
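The comments above describe how FixtureDef.execute registers finalizers on the fixtures it depends on so that teardown happens in the right order. A minimal sketch of the same teardown idea expressed with the public request.addfinalizer API (names illustrative):

    # minimal sketch, assuming pytest is installed
    import pytest

    @pytest.fixture
    def resource(request):
        handle = {"open": True}
        def close():
            handle["open"] = False
        # registered teardown runs after everything depending on this fixture
        request.addfinalizer(close)
        return handle

    def test_uses_resource(resource):
        assert resource["open"]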
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________________ test_to_sql_save_index[mysql_pymysql_engine] _________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_save_index(conn, request): 2413s if "adbc" in conn: 2413s request.node.add_marker( 2413s pytest.mark.xfail( 2413s reason="ADBC implementation does not create index", strict=True 2413s ) 2413s ) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3114: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
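The OperationalError above means nothing is listening on localhost:5432 inside the testbed, so every postgresql_* fixture fails as soon as SQLAlchemy asks psycopg2 for a raw connection. A minimal sketch (assuming only that psycopg2 is installed, and reusing the DSN parameters shown in the traceback) that performs the same connection attempt outside pytest:

    import psycopg2

    # Same keyword parameters that SQLAlchemy hands to psycopg2.connect() above.
    params = {"host": "localhost", "dbname": "pandas", "user": "postgres",
              "password": "postgres", "port": 5432}
    try:
        conn = psycopg2.connect(**params)
        conn.close()
        print("PostgreSQL reachable")
    except psycopg2.OperationalError as exc:
        # On this testbed no server is running, so this branch is taken
        # with the same "Connection refused" message as in the log.
        print("PostgreSQL unreachable:", exc)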
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10f7d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10f7d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________________ test_to_sql_save_index[mysql_pymysql_conn] __________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_save_index(conn, request): 2413s if "adbc" in conn: 2413s request.node.add_marker( 2413s pytest.mark.xfail( 2413s reason="ADBC implementation does not create index", strict=True 2413s ) 2413s ) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3114: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
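The AttributeError above originates in the cryptography package itself, not in pymysql or pandas: cryptography/hazmat/primitives/hashes.py expects a hashes submodule on the compiled _rust.openssl bindings, and the bindings installed on this testbed do not provide it. A minimal sketch (assuming only that python3-cryptography is installed) that isolates the failing import chain without going through pymysql:

    # Reproduces the import failure seen in the pymysql traceback above.
    try:
        from cryptography.hazmat.primitives import hashes
        print("cryptography hashes OK:", hashes.SHA256().name)
    except AttributeError as exc:
        # Raised when the pure-Python cryptography sources and the compiled
        # _rust bindings come from mismatched builds, as in this log.
        print("cryptography bindings mismatch:", exc)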
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10f8f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb10f8f0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
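The PyMySQL module being imported here ships a small mysqlclient-compatibility shim; a short usage sketch based only on the public functions shown in this traceback:

    import sys
    import pymysql

    # Registers pymysql under the name "MySQLdb" so code written against
    # mysqlclient (for example Django's MySQL backend) imports pymysql instead.
    pymysql.install_as_MySQLdb()

    import MySQLdb
    assert MySQLdb is sys.modules["pymysql"]
    print(MySQLdb.get_client_info())  # reports the mysqlclient-compatible version string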
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______________ test_to_sql_save_index[postgresql_psycopg2_engine] ______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
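The root cause of this failure is not a missing pymysql at all: cryptography's pure-Python hashes module expects the compiled Rust extension to expose an openssl.hashes attribute, and on this testbed it does not, so importing pymysql raises AttributeError rather than ImportError and the importorskip-style helper cannot turn it into a skip. A minimal sketch that separates the two situations, assuming the same mismatched cryptography build as in the traceback above:

    try:
        from cryptography.hazmat.primitives import hashes
    except ImportError:
        print("cryptography is not installed")
    except AttributeError as exc:
        # The Python layer and the compiled Rust bindings do not match.
        print(f"cryptography is installed but its compiled bindings are incompatible: {exc}")
    else:
        print("cryptography imports cleanly:", hashes.SHA256().name)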
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
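The psycopg2 docstring above accepts either a DSN string or keyword arguments; a minimal sketch using the same parameters the fixtures in this run pass (it requires a PostgreSQL server actually listening on localhost:5432, otherwise it raises the same OperationalError seen below):

    import psycopg2

    # Keyword form; the equivalent DSN string is
    # "host=localhost dbname=pandas user=postgres password=postgres port=5432".
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT version()")
        print(cur.fetchone()[0])
    conn.close()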
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_save_index(conn, request): 2413s if "adbc" in conn: 2413s request.node.add_marker( 2413s pytest.mark.xfail( 2413s reason="ADBC implementation does not create index", strict=True 2413s ) 2413s ) 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s df = DataFrame.from_records( 2413s [(1, 2.1, "line1"), (2, 1.5, "line2")], columns=["A", "B", "C"], index=["A"] 2413s ) 2413s 2413s tbl_name = "test_to_sql_saves_index" 2413s > with pandasSQL_builder(conn) as pandasSQL: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
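pandasSQL_builder, whose docstring appears just above, picks the SQL backend from the type of the connection object: a plain sqlite3.Connection uses the built-in SQLite code path with no extra driver or server, while strings and SQLAlchemy connectables go through SQLAlchemy. A driver-free sketch of the same kind of round trip the failing test attempts, reusing the test's DataFrame construction:

    import sqlite3
    import pandas as pd

    df = pd.DataFrame.from_records(
        [(1, 2.1, "line1"), (2, 1.5, "line2")], columns=["A", "B", "C"], index=["A"]
    )

    # An in-memory sqlite3.Connection takes the server-less path, so the
    # round trip works even on a testbed with no database service running.
    conn = sqlite3.connect(":memory:")
    df.to_sql("test_to_sql_saves_index", conn, index=True)
    print(pd.read_sql_query("SELECT * FROM test_to_sql_saves_index", conn))
    conn.close()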
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______________ test_to_sql_save_index[postgresql_psycopg2_conn] _______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_save_index(conn, request): 2413s if "adbc" in conn: 2413s request.node.add_marker( 2413s pytest.mark.xfail( 2413s reason="ADBC implementation does not create index", strict=True 2413s ) 2413s ) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3114: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
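The fixture machinery in these frames resolves the parametrized name 'postgresql_psycopg2_conn' into an object at run time via request.getfixturevalue. A small self-contained sketch of the same pattern, with hypothetical fixture names that are illustrations rather than fixtures from the pandas suite:

    import pytest

    @pytest.fixture
    def sqlite_url():
        return "sqlite:///:memory:"

    @pytest.fixture
    def sqlite_url_upper(sqlite_url):
        return sqlite_url.upper()

    @pytest.mark.parametrize("conn", ["sqlite_url", "sqlite_url_upper"])
    def test_resolves_fixture_by_name(conn, request):
        # The parametrized value is a fixture *name*; resolving it here is the
        # same dynamic lookup test_to_sql_save_index performs above.
        value = request.getfixturevalue(conn)
        assert isinstance(value, str)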
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________________ test_transactions[mysql_pymysql_engine] ____________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transactions(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a4290>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a4290> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________________ test_transactions[mysql_pymysql_conn] _____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transactions(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a43b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a43b0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________________ test_transactions[postgresql_psycopg2_engine] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
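The AttributeError above (the cryptography Rust bindings expose no 'hashes' attribute on this testbed) is what makes the pymysql import fail: pymysql._auth pulls in cryptography's hash primitives during import. A minimal probe of that same import path, as a sketch assuming the same cryptography installation; on a healthy build it prints a SHA-256 digest, on this testbed it would raise the same AttributeError:

    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas")
    print(digest.finalize().hex())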
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transactions(conn, request): 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s 2413s stmt = "CREATE TABLE test_trans (A INT, B TEXT)" 2413s if conn_name != "sqlite_buildin" and "adbc" not in conn_name: 2413s from sqlalchemy import text 2413s 2413s stmt = text(stmt) 2413s 2413s > with pandasSQL_builder(conn) as pandasSQL: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3156: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
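pandasSQL_builder above wraps the fixture's SQLAlchemy engine, and the connection attempt fails because nothing is listening on localhost:5432 inside the testbed. A connectivity sketch using the same URL as the engine shown in the traceback (parameters taken from the dsn above; purely illustrative):

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            # Trivial round-trip to confirm the server is reachable.
            print(conn.execute(text("SELECT 1")).scalar())
    except OperationalError as exc:
        print(f"PostgreSQL not reachable: {exc}")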
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
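The psycopg2.connect docstring above lists the basic connection parameters; the failing call in this run uses host=localhost, port=5432, dbname=pandas and user/password postgres, as shown in the dsn. The same attempt as a standalone sketch (it fails identically while no server accepts TCP connections on that port):

    import psycopg2

    try:
        conn = psycopg2.connect(
            dbname="pandas",
            user="postgres",
            password="postgres",
            host="localhost",
            port=5432,
        )
        conn.close()
    except psycopg2.OperationalError as exc:
        print(f"connection failed: {exc}")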
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________________ test_transactions[postgresql_psycopg2_conn] __________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transactions(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
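For reference, a minimal sketch (not part of the test run) of the raw-DBAPI access pattern that the raw_connection docstring above describes; it assumes `engine` is the SQLAlchemy Engine shown in this traceback and that a server is actually reachable:

    raw = engine.raw_connection()
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchone())
        cur.close()
    finally:
        raw.close()  # returns the DBAPI connection to the pool rather than closing it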
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
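A minimal standalone sketch of the connection the fixture attempts, built from the cparams shown above; it assumes a PostgreSQL server is listening on localhost:5432, which is not the case on this testbed:

    import psycopg2

    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())
    conn.close()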
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______________ test_transaction_rollback[mysql_pymysql_engine] ________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transaction_rollback(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3164: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a4e90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a4e90> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
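A short usage sketch of the mysqlclient-compatibility shim quoted above (it assumes pymysql imports cleanly, which it does not in this run because of the cryptography failure below):

    import pymysql

    pymysql.install_as_MySQLdb()
    import MySQLdb  # now an alias for the pymysql module

    print(MySQLdb.get_client_info())  # "1.4.6", the mysqlclient-compatible version string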
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________________ test_transaction_rollback[mysql_pymysql_conn] _________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transaction_rollback(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3164: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
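The dynamic-lookup pattern described in the getfixturevalue docstring above is what test_transaction_rollback relies on; a minimal sketch (the test name here is illustrative only, the fixture names are the ones parametrized in this log):

    import pytest

    @pytest.mark.parametrize("conn", ["mysql_pymysql_conn", "postgresql_psycopg2_conn"])
    def test_roundtrip(conn, request):
        # Resolve the backend fixture chosen by parametrization at setup time.
        conn = request.getfixturevalue(conn)
        assert conn is not None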
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a4fb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a4fb0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________ test_transaction_rollback[postgresql_psycopg2_engine] _____________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
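The AttributeError at hashes.py:87 above is the root cause of this failure: the pure-Python layer expects the compiled Rust extension to expose rust_openssl.hashes.Hash, but the cryptography build installed in this testbed does not provide that attribute. Because pymysql imports cryptography for its authentication plugins, importing pymysql raises AttributeError rather than ImportError, so the MySQL fixtures error out instead of being skipped. A standalone probe for the mismatch, written as an illustrative sketch (module paths are taken from the traceback, the probe function itself is not part of any package):

    # Illustrative probe for the binding mismatch reported above.
    import importlib

    def rust_openssl_exposes_hashes():
        try:
            rust_openssl = importlib.import_module(
                "cryptography.hazmat.bindings._rust.openssl"
            )
        except ImportError:
            return False
        # hashes.py does `Hash = rust_openssl.hashes.Hash`; on the mismatched
        # build in this testbed the `hashes` attribute is missing.
        return hasattr(rust_openssl, "hashes")

    if __name__ == "__main__":
        print("rust openssl exposes 'hashes':", rust_openssl_exposes_hashes())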
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transaction_rollback(conn, request): 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s > with pandasSQL_builder(conn) as pandasSQL: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3165: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
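The psycopg2.OperationalError above ("Connection refused" on both ::1 and 127.0.0.1, port 5432) simply means that no PostgreSQL server is listening inside the testbed, so every PostgreSQL-backed parametrization of test_transaction_rollback fails while setting up its connection fixture, using the DSN host=localhost dbname=pandas user=postgres password=postgres port=5432 shown in the frames. A hedged pre-flight reachability check mirroring that DSN (the helper itself is illustrative and not part of the test suite):

    # Illustrative reachability probe; host and port are copied from the DSN
    # in the traceback above.
    import socket

    def postgres_reachable(host="localhost", port=5432, timeout=1.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            # Matches the "Connection refused" seen above when nothing listens.
            return False

    if __name__ == "__main__":
        print("postgres reachable:", postgres_reachable())

A TCP probe like this only tells you whether something is listening; the tests additionally need a server that actually has the pandas database and the postgres credentials from the DSN.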
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
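The _handle_dbapi_exception_noconnection frame above shows how SQLAlchemy converts the driver-level psycopg2.OperationalError into its own sqlalchemy.exc.OperationalError while keeping the original traceback and chaining the cause with `from e`. A generic sketch of that wrap-and-chain pattern, with placeholder exception classes rather than SQLAlchemy's:

    # Generic wrap-and-chain sketch; DriverError / WrappedOperationalError are
    # placeholders standing in for psycopg2 and SQLAlchemy exception classes.
    import sys

    class DriverError(Exception):
        pass

    class WrappedOperationalError(Exception):
        pass

    def connect_raw():
        raise DriverError('connection to server at "localhost", port 5432 failed')

    def connect():
        try:
            return connect_raw()
        except DriverError as e:
            exc_info = sys.exc_info()
            wrapped = WrappedOperationalError(str(e))
            # Mirrors: raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
            raise wrapped.with_traceback(exc_info[2]) from e

    if __name__ == "__main__":
        try:
            connect()
        except WrappedOperationalError as err:
            print("wrapped:", err)
            print("caused by:", repr(err.__cause__))

This is why the same connection failure appears twice in the report: first as the raw psycopg2.OperationalError, which the report then marks as the direct cause of the sqlalchemy.exc.OperationalError raised to the test.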
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _____________ test_transaction_rollback[postgresql_psycopg2_conn] ______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
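One detail worth noting from the repeated pool frames: creating the Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) in the fixture never touches the network; SQLAlchemy only opens a DBAPI connection when connect()/raw_connection() checks one out of the pool, which is why both the _engine and _conn fixtures fail at the same __connect frame. A hedged illustration of that lazy behaviour (the URL mirrors the fixtures above; nothing else here is taken from the test suite):

    # Hedged illustration of lazy connection: nothing is contacted until
    # connect() is called; the URL mirrors the fixtures in the log.
    from sqlalchemy import create_engine, exc, text

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )  # succeeds even when no server is running

    try:
        with engine.connect() as conn:  # first real connection attempt
            conn.execute(text("SELECT 1"))
    except exc.OperationalError as err:
        print("connect failed, as in the log:", err)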
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_transaction_rollback(conn, request): 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3164: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
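The getfixturevalue docstring above explains why these failures surface inside the test body: test_transaction_rollback is parametrized with fixture names and resolves the actual connection fixture dynamically with request.getfixturevalue(conn), so the connection error is raised during that lookup rather than at collection time. A minimal, hypothetical example of the same pattern using an in-memory SQLite fixture:

    # Minimal example of dynamic fixture lookup, the pattern used by
    # test_transaction_rollback above; the fixture here is made up.
    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn_name", ["sqlite_conn"])
    def test_dynamic_fixture_lookup(conn_name, request):
        conn = request.getfixturevalue(conn_name)  # resolved lazily at test time
        assert conn.execute("SELECT 1").fetchone() == (1,)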
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________ test_get_schema_create_table[mysql_pymysql_engine] ______________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s test_frame3 = index A B 2413s 0 2000-01-03 00:00:00 2147483647 -1.987670 2413s 1 2000-01-04 00:00:00 -29 -0.041232 2413s 2 2000-01-05 00:00:00 20000 0.731168 2413s 3 2000-01-06 00:00:00 -290867 1.567621 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_get_schema_create_table(conn, request, test_frame3): 2413s # Use a dataframe without a bool column, since MySQL converts bool to 2413s # TINYINT (which read_sql_table returns as an int and causes a dtype 2413s # mismatch) 2413s if conn == "sqlite_str": 2413s request.applymarker( 2413s pytest.mark.xfail(reason="test does not support sqlite_str fixture") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3213: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 
2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 
2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 
2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 
2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a5b50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a5b50> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 
2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________ test_get_schema_create_table[mysql_pymysql_conn] _______________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s test_frame3 = index A B 2413s 0 2000-01-03 00:00:00 2147483647 -1.987670 2413s 1 2000-01-04 00:00:00 -29 -0.041232 2413s 2 2000-01-05 00:00:00 20000 0.731168 2413s 3 2000-01-06 00:00:00 -290867 1.567621 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_get_schema_create_table(conn, request, test_frame3): 2413s # Use a dataframe without a bool column, since MySQL converts bool to 2413s # TINYINT (which read_sql_table returns as an int and causes a dtype 2413s # mismatch) 2413s if conn == "sqlite_str": 2413s request.applymarker( 2413s pytest.mark.xfail(reason="test does not support sqlite_str fixture") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3213: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 
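Editorial note: the AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is raised while pymysql imports cryptography, so the MySQL fixtures error out at import time rather than skipping. One plausible cause, not confirmed by this log, is a pure-Python cryptography layer that does not match the compiled _rust extension available to this i386 run. A hypothetical pre-flight probe, assuming only that the cryptography package is importable, might look like this:

import importlib

def cryptography_rust_bindings_ok() -> bool:
    # Best-effort probe: True if cryptography's compiled OpenSSL bindings expose the
    # `hashes` attribute that the pure-Python hashes.py layer expects at import time.
    try:
        rust = importlib.import_module("cryptography.hazmat.bindings._rust")
    except Exception:
        return False
    return hasattr(getattr(rust, "openssl", None), "hashes")

if __name__ == "__main__":
    print("cryptography bindings usable:", cryptography_rust_bindings_ok())

A False result would reproduce the failure mode seen here without involving pymysql or pandas at all, which helps narrow the problem to the cryptography packaging on the testbed.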
2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a5c70>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a5c70> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________ test_get_schema_create_table[postgresql_psycopg2_engine] ___________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
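The AttributeError that ends the traceback above is the root cause of the pymysql-based failures in this run: the compiled Rust extension shipped with python3-cryptography on this testbed does not expose an openssl.hashes submodule, so the pure-Python hashes.py cannot bind Hash, and pymysql's _auth module (which imports cryptography) fails to import. A minimal probe of that condition, assuming the same cryptography build as on the testbed:

    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # On a consistent install this prints True and hashes.Hash is usable;
    # on this testbed it does not, matching the AttributeError above.
    print(hasattr(rust_openssl, "hashes"))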
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s test_frame3 = index A B 2413s 0 2000-01-03 00:00:00 2147483647 -1.987670 2413s 1 2000-01-04 00:00:00 -29 -0.041232 2413s 2 2000-01-05 00:00:00 20000 0.731168 2413s 3 2000-01-06 00:00:00 -290867 1.567621 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_get_schema_create_table(conn, request, test_frame3): 2413s # Use a dataframe without a bool column, since MySQL converts bool to 2413s # TINYINT (which read_sql_table returns as an int and causes a dtype 2413s # mismatch) 2413s if conn == "sqlite_str": 2413s request.applymarker( 2413s pytest.mark.xfail(reason="test does not support sqlite_str fixture") 2413s ) 2413s 2413s conn = request.getfixturevalue(conn) 2413s 2413s from sqlalchemy import text 2413s from sqlalchemy.engine import Engine 2413s 2413s tbl = "test_get_schema_create_table" 2413s > create_sql = sql.get_schema(test_frame3, tbl, con=conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3219: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = index A B 2413s 0 2000-01-03 00:00:00 2147483647 -1.987670 2413s 1 2000-01-04 00:00:00 -29 -0.041232 2413s 2 2000-01-05 00:00:00 20000 0.731168 2413s 3 2000-01-06 00:00:00 -290867 1.567621 2413s name = 'test_get_schema_create_table', keys = None 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s dtype = None, schema = None 2413s 2413s def get_schema( 2413s frame, 2413s name: str, 2413s keys=None, 2413s con=None, 2413s dtype: DtypeArg | None = None, 2413s schema: str | None = None, 2413s ) 
-> str: 2413s """ 2413s Get the SQL db table schema for the given frame. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame 2413s name : str 2413s name of SQL table 2413s keys : string or sequence, default: None 2413s columns to use a primary key 2413s con: ADBC Connection, SQLAlchemy connectable, sqlite3 connection, default: None 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s dtype : dict of column name to SQL type, default None 2413s Optional specifying the datatype for columns. The SQL type should 2413s be a SQLAlchemy type, or a string for sqlite3 fallback connection. 2413s schema: str, default: None 2413s Optional specifying the schema to be used in creating the table. 2413s """ 2413s > with pandasSQL_builder(con=con) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:2923: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 
2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s 
None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________ test_get_schema_create_table[postgresql_psycopg2_conn] ____________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s test_frame3 = index A B 2413s 0 2000-01-03 00:00:00 2147483647 -1.987670 2413s 1 2000-01-04 00:00:00 -29 -0.041232 2413s 2 2000-01-05 00:00:00 20000 0.731168 2413s 3 2000-01-06 00:00:00 -290867 1.567621 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_get_schema_create_table(conn, request, test_frame3): 2413s # Use a dataframe without a bool column, since MySQL converts bool to 2413s # TINYINT (which read_sql_table returns as an int and causes a dtype 2413s # mismatch) 2413s if conn == "sqlite_str": 2413s request.applymarker( 2413s pytest.mark.xfail(reason="test does not support sqlite_str fixture") 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3213: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 
2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 
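For context on the pytest machinery in these frames: test_get_schema_create_table is parametrized on a fixture *name* and resolves it at run time with request.getfixturevalue, which is why the connection error surfaces during fixture setup rather than inside the test body. A minimal sketch of that pattern (the fixture name and return value here are made up for illustration):

    import pytest

    @pytest.fixture
    def fake_conn():  # hypothetical stand-in for e.g. postgresql_psycopg2_conn
        return "connection"

    @pytest.mark.parametrize("conn", ["fake_conn"])
    def test_dynamic_fixture(conn, request):
        conn = request.getfixturevalue(conn)  # runs the named fixture's setup now
        assert conn == "connection"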
2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 
2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s 
None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______________________ test_dtype[mysql_pymysql_engine] _______________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3238: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
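The PostgreSQL failures above all share one root cause: nothing is listening on localhost:5432 inside the testbed, so when the fixture finally reaches psycopg2.connect() with the DSN shown in the traceback, the connection is refused and SQLAlchemy surfaces it as OperationalError during fixture setup. Below is a minimal sketch of a fixture that probes for the server and skips instead of erroring; the helper names (postgres_available, postgresql_engine_or_skip) are illustrative assumptions, not part of pandas' test suite, while the DSN is the one visible in the log.

    import socket

    import pytest

    def postgres_available(host="localhost", port=5432, timeout=1.0):
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @pytest.fixture
    def postgresql_engine_or_skip():
        # Skip rather than error when no PostgreSQL server is reachable,
        # e.g. in a testbed like this one where the service is not running.
        if not postgres_available():
            pytest.skip("no PostgreSQL server on localhost:5432")
        sqlalchemy = pytest.importorskip("sqlalchemy")
        # Engine creation is lazy; connecting only happens inside the test.
        return sqlalchemy.create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
        )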
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a60f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a60f0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
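The frames above show why a MySQL-backed pandas test dies long before any database I/O: importing pymysql pulls in pymysql._auth, which in turn imports cryptography's serialization and hash primitives, so a broken cryptography installation breaks `import pymysql` itself. A hedged isolation of that chain (illustrative only, not taken from the test suite):

    # Reproduce just the import that the pymysql._auth frames above reach.
    # In a healthy environment this succeeds; in this testbed it raises the
    # AttributeError shown in the continuation of the traceback below.
    try:
        from cryptography.hazmat.primitives import hashes  # noqa: F401
    except Exception as exc:
        print(f"cryptography is not usable here: {exc!r}")
    else:
        print("cryptography hash primitives import cleanly")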
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ________________________ test_dtype[mysql_pymysql_conn] ________________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3238: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
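The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') means the pure-Python layer of cryptography expected its compiled Rust extension to expose an openssl.hashes submodule and did not find it. A plausible reading, though the log itself does not say so, is a mismatch between the installed cryptography Python package and the _rust extension actually loaded (for instance a version skew between the two). A small diagnostic sketch, assuming only that cryptography itself is importable:

    # Probe what the compiled bindings actually expose, without unconditionally
    # triggering the attribute access that fails in hashes.py.
    import cryptography
    from cryptography.hazmat.bindings import _rust

    print("cryptography version:", cryptography.__version__)
    openssl = getattr(_rust, "openssl", None)
    print("openssl bindings present:", openssl is not None)
    print("openssl exposes 'hashes':", hasattr(openssl, "hashes"))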
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a62d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a62d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ____________________ test_dtype[postgresql_psycopg2_engine] ____________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s conn = request.getfixturevalue(conn) 2413s 2413s from sqlalchemy import ( 2413s TEXT, 2413s String, 2413s ) 2413s from sqlalchemy.schema import MetaData 2413s 2413s cols = ["A", "B"] 2413s data = [(0.8, True), (0.9, None)] 2413s df = DataFrame(data, columns=cols) 2413s > assert df.to_sql(name="dtype_test", con=conn) == 2 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3249: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( A B 2413s 0 0.8 True 2413s 1 0.9 None,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'name': 'dtype_test'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = A B 2413s 0 0.8 True 2413s 1 0.9 None, name = 'dtype_test' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 
2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 
2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = A B 2413s 0 0.8 True 2413s 1 0.9 None, name = 'dtype_test' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 
2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _____________________ test_dtype[postgresql_psycopg2_conn] _____________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3238: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________________ test_notna_dtype[mysql_pymysql_engine] ____________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_notna_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3281: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
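Note on the failure above: the OperationalError comes from psycopg2 being unable to reach anything on localhost:5432, so every SQL test parametrized with the postgresql fixtures fails during fixture setup rather than in the test body. A minimal sketch of what the fixture effectively attempts, assuming sqlalchemy and psycopg2 are installed and reusing the connection URL shown in the traceback (a reproduction of the connection attempt, not a fix):

    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    # URL copied from the traceback above (test credentials, local server).
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect():
            print("PostgreSQL reachable")
    except OperationalError as exc:
        # On a testbed with no PostgreSQL server listening on port 5432 this
        # branch is taken: "connection to server at \"localhost\" ... Connection refused"
        print(exc)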
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
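For reference, the helper quoted above is pandas' wrapper for importing optional test dependencies. A short usage sketch, assuming the same module path as in the traceback (pandas.compat._optional); with errors="ignore" a missing driver simply comes back as None, which is why tests skip instead of erroring when a backend is absent:

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore" returns None when the module is not installed.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    print("pymysql importable:", pymysql is not None)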
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a6870>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a6870> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _____________________ test_notna_dtype[mysql_pymysql_conn] _____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_notna_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3281: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
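Note on the AttributeError above: it is raised while merely importing pymysql. cryptography's pure-Python hashes module expects the compiled _rust.openssl extension to expose a hashes attribute, and on this testbed it does not, which points at a mismatched cryptography installation rather than at pandas or pymysql themselves. A small check that reproduces the failure independently of the test suite (assuming the same broken environment; on a healthy install it prints nothing):

    import importlib

    # Importing either module executes the `Hash = rust_openssl.hashes.Hash`
    # line quoted above; on this testbed that raises AttributeError.
    for mod in ("cryptography.hazmat.primitives.hashes", "pymysql"):
        try:
            importlib.import_module(mod)
        except AttributeError as exc:
            print(f"{mod}: {exc}")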
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a6990>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a6990> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _________________ test_notna_dtype[postgresql_psycopg2_engine] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_notna_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s 2413s from sqlalchemy import ( 2413s Boolean, 2413s DateTime, 2413s Float, 2413s Integer, 2413s ) 2413s from sqlalchemy.schema import MetaData 2413s 2413s cols = { 2413s "Bool": Series([True, None]), 2413s "Date": Series([datetime(2012, 5, 1), None]), 2413s "Int": Series([1, None], dtype="object"), 2413s "Float": Series([1.1, None]), 2413s } 2413s df = DataFrame(cols) 2413s 2413s tbl = "notna_dtype_test" 2413s > assert df.to_sql(name=tbl, con=conn) == 2 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3300: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( Bool Date Int Float 2413s 0 True 2012-05-01 1 1.1 2413s 1 None NaT None NaN,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'name': 'notna_dtype_test'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Bool Date Int Float 2413s 0 
True 2012-05-01 1 1.1 2413s 1 None NaT None NaN 2413s name = 'notna_dtype_test' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. 
None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... 
return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = Bool Date Int Float 2413s 0 True 2012-05-01 1 1.1 2413s 1 None NaT None NaN 2413s name = 'notna_dtype_test' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 
2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
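As the body of pandasSQL_builder below shows, the backend is chosen from the type of con: a sqlite3.Connection (or None) is handled by SQLiteDatabase and needs no server, while a string or SQLAlchemy connectable goes to SQLDatabase, which opens a real connection and is the path that fails in this run. A minimal sketch of the server-free path, using an in-memory SQLite database (the frame and table names here are illustrative, not taken from the test suite):

    import sqlite3
    import pandas as pd

    df = pd.DataFrame({"Bool": [True, None], "Float": [1.1, None]})
    con = sqlite3.connect(":memory:")       # DBAPI connection, no server needed
    rows = df.to_sql(name="demo_table", con=con, index=False)
    print(rows)                             # 2 -- number of rows written
    con.close()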
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __________________ test_notna_dtype[postgresql_psycopg2_conn] __________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
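The sqlalchemy.exc.OperationalError reported above is the wrapped form of the psycopg2 error: create_engine() itself is lazy, so the failure only surfaces once the fixture checks a connection out of the pool. A minimal sketch of that behaviour (an illustration only; it assumes psycopg2 is installed and reuses the DSN from this log):

from sqlalchemy import create_engine, exc

# Succeeds even with no PostgreSQL server listening on localhost:5432,
# because no connection is opened at engine-creation time.
engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)

try:
    with engine.connect():    # first pool checkout -> psycopg2.connect()
        pass
except exc.OperationalError as err:
    print(err)                # "(psycopg2.OperationalError) ... Connection refused"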
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_notna_dtype(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3281: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
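The psycopg2.OperationalError just above is the original DBAPI exception; SQLAlchemy re-raises its wrapped form with "from e", which is why the log notes it was the direct cause of the exception that follows. A standalone reproduction sketch, assuming psycopg2 is installed and nothing is listening on localhost:5432:

import psycopg2

try:
    psycopg2.connect(
        host="localhost", port=5432, dbname="pandas",
        user="postgres", password="postgres",
    )
except psycopg2.OperationalError as err:
    print(err)   # connection to server at "localhost" ... failed: Connection refused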
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
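request.getfixturevalue(), whose docstring appears a little earlier, is how these parametrized tests turn a fixture name into a fixture value; the FixtureDef/pluggy machinery shown above then executes the fixture and caches its result. A simplified, self-contained sketch of the pattern (the fixture body here is a stand-in for illustration; pandas' real fixture opens a SQLAlchemy connection, which is where the OperationalError is raised):

import pytest

@pytest.fixture
def postgresql_psycopg2_conn():
    # Stand-in body, not the pandas implementation.
    pytest.skip("no PostgreSQL server available")

@pytest.mark.parametrize("conn", ["postgresql_psycopg2_conn"])
def test_example(conn, request):
    conn = request.getfixturevalue(conn)   # resolves and runs the named fixture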
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________________ test_double_precision[mysql_pymysql_engine] __________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_double_precision(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
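Every postgresql_psycopg2_* failure in this run has the same root cause: nothing is listening on localhost:5432 in the testbed. For comparison only (this is not what the pandas tests do), one way a suite can skip rather than error in that situation is to probe the port and build a skipif marker from the result:

import socket

import pytest

def _port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

requires_postgres = pytest.mark.skipif(
    not _port_open("localhost", 5432),
    reason="no PostgreSQL server on localhost:5432",
)

@requires_postgres
def test_needs_database():
    ...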
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a7170>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a7170> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________________ test_double_precision[mysql_pymysql_conn] ___________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_double_precision(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
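The AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") points at a mismatch between python3-cryptography's Python layer and its compiled Rust bindings on this testbed rather than at a pandas or pymysql problem. A minimal check, independent of pytest and pandas (assuming only that the cryptography package is importable), would be:

    # Sketch: verify that cryptography's Rust bindings expose the submodules
    # its Python layer expects. On a consistent install this prints True;
    # on the testbed in this log it would print False, matching the
    # AttributeError raised from hashes.py line 87 above.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print(hasattr(rust_openssl, "hashes"))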
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a7290>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a7290> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______________ test_double_precision[postgresql_psycopg2_engine] _______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
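As the import chain in the traceback shows (pymysql -> pymysql.connections -> pymysql._auth -> cryptography hashes), the failure does not need pytest or pandas to reproduce; a bare import is enough on an installation in the state captured by this log (hypothetical repro sketch, not part of the test suite):

    # Sketch of the import chain from the traceback above. Because the broken
    # cryptography install raises AttributeError rather than ImportError,
    # importorskip-style helpers presumably cannot turn this into a skip,
    # which is consistent with these tests erroring instead of being skipped.
    try:
        import pymysql  # noqa: F401
    except Exception as exc:
        print(type(exc).__name__, exc)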
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_double_precision(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s conn = request.getfixturevalue(conn) 2413s 2413s from sqlalchemy import ( 2413s BigInteger, 2413s Float, 2413s Integer, 2413s ) 2413s from sqlalchemy.schema import MetaData 2413s 2413s V = 1.23456789101112131415 2413s 2413s df = DataFrame( 2413s { 2413s "f32": Series([V], dtype="float32"), 2413s "f64": Series([V], dtype="float64"), 2413s "f64_as_f32": Series([V], dtype="float64"), 2413s "i32": Series([5], dtype="int32"), 2413s "i64": Series([5], dtype="int64"), 2413s } 2413s ) 2413s 2413s > assert ( 2413s df.to_sql( 2413s name="test_dtypes", 2413s con=conn, 2413s index=False, 2413s if_exists="replace", 2413s dtype={"f64_as_f32": Float(precision=23)}, 2413s ) 2413s == 1 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3338: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( f32 f64 f64_as_f32 i32 i64 2413s 0 1.234568 1.234568 1.234568 5 5,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'dtype': {'f64_as_f32': Float(precision=23)}, 'if_exists': 'replace', 'index': False, ...} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, 
**kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = f32 f64 f64_as_f32 i32 i64 2413s 0 1.234568 1.234568 1.234568 5 5 2413s name = 'test_dtypes' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'replace', index = False, index_label = None 2413s chunksize = None, dtype = {'f64_as_f32': Float(precision=23)}, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 
2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... 
stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = f32 f64 f64_as_f32 i32 i64 2413s 0 1.234568 1.234568 1.234568 5 5 2413s name = 'test_dtypes' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'replace', index = False, index_label = None 2413s chunksize = None, dtype = {'f64_as_f32': Float(precision=23)}, method = None 2413s engine = 'auto', engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 
2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 
2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = 
_raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______________ test_double_precision[postgresql_psycopg2_conn] ________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_double_precision(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("sqlite_str has no inspection system") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3317: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _____________ test_connectable_issue_example[mysql_pymysql_engine] _____________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_connectable_issue_example(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3366: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a79b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a79b0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______________ test_connectable_issue_example[mysql_pymysql_conn] ______________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_connectable_issue_example(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3366: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a7ad0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9eb1a7ad0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________ test_connectable_issue_example[postgresql_psycopg2_engine] __________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_connectable_issue_example(conn, request): 2413s conn = request.getfixturevalue(conn) 2413s 2413s # This tests the example raised in issue 2413s # https://github.com/pandas-dev/pandas/issues/10104 2413s from sqlalchemy.engine import Engine 2413s 2413s def test_select(connection): 2413s query = "SELECT test_foo_data FROM test_foo_data" 2413s return sql.read_sql_query(query, con=connection) 2413s 2413s def test_append(connection, data): 2413s data.to_sql(name="test_foo_data", con=connection, if_exists="append") 2413s 2413s def test_connectable(conn): 2413s # https://github.com/sqlalchemy/sqlalchemy/commit/ 2413s # 00b5c10846e800304caa86549ab9da373b42fa5d#r48323973 2413s foo_data = test_select(conn) 2413s test_append(conn, foo_data) 2413s 2413s def main(connectable): 2413s if isinstance(connectable, Engine): 2413s with connectable.connect() as conn: 2413s with conn.begin(): 2413s test_connectable(conn) 2413s else: 2413s test_connectable(connectable) 2413s 2413s > assert ( 2413s DataFrame({"test_foo_data": [0, 1, 2]}).to_sql(name="test_foo_data", con=conn) 2413s == 3 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3393: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( test_foo_data 2413s 0 0 2413s 1 1 2413s 2 2,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'name': 'test_foo_data'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > 
num_allow_args: 2413s warnings.warn( 2413s msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = test_foo_data 2413s 0 0 2413s 1 1 2413s 2 2 2413s name = 'test_foo_data' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 
2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... 
# "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = test_foo_data 2413s 0 0 2413s 1 1 2413s 2 2 2413s name = 'test_foo_data' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = True, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 
2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 
2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = 
_raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ___________ test_connectable_issue_example[postgresql_psycopg2_conn] ___________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_connectable_issue_example(conn, request): 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3366: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
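# A minimal sketch, assuming only pytest's public API, of the dependency shape
# the comment above refers to: a non-parametrized fixture that depends on a
# parametrized one, much like postgresql_psycopg2_conn wrapping
# postgresql_psycopg2_engine further down in this traceback (the URL is taken
# from the log; example_engine and example_conn are hypothetical names):
#
#     import pytest
#     from sqlalchemy import create_engine
#
#     @pytest.fixture(params=["postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"])
#     def example_engine(request):
#         return create_engine(request.param)
#
#     @pytest.fixture
#     def example_conn(example_engine):
#         with example_engine.connect() as conn:
#             yield conn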
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________ test_to_sql_with_negative_npinf[input0-mysql_pymysql_engine] _________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s input = {'foo': [inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
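The OperationalError above means nothing is listening on localhost:5432 in the testbed, so every PostgreSQL-backed SQL test fails at fixture setup. A minimal diagnostic sketch, outside pytest and not part of the pandas test suite, using the same connection parameters shown in the traceback (user postgres, password postgres, database pandas):

from sqlalchemy import create_engine

# Same URL as the postgresql_psycopg2_engine fixture in the log above.
engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect():
        print("PostgreSQL is reachable")
except Exception as exc:  # OperationalError when no server accepts TCP/IP on port 5432
    print(f"connection failed: {exc}")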
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0110>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0110> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________ test_to_sql_with_negative_npinf[input0-mysql_pymysql_conn] __________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s input = {'foo': [inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 
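The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') suggests the pure-Python layer of the cryptography package and its compiled _rust bindings are out of step on this testbed, which in turn makes the pymysql import fail and takes every MySQL-backed test down with it. A quick sanity-check sketch (diagnostic only, not from the test suite): on a consistent installation this prints a digest, while the mismatch seen here raises the same AttributeError at import time.

from cryptography.hazmat.primitives import hashes

digest = hashes.Hash(hashes.SHA256())
digest.update(b"pandas")
print(digest.finalize().hex())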
2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d01d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d01d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______ test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_engine] ______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s input = {'foo': [inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s 2413s if "mysql" in conn_name: 2413s # GH 36465 2413s # The input {"foo": [-np.inf], "infe0": ["bar"]} does not raise any error 2413s # for pymysql version >= 0.10 2413s # TODO(GH#36465): remove this version check after GH 36465 is fixed 2413s pymysql = td.versioned_importorskip("pymysql") 2413s 2413s if Version(pymysql.__version__) < Version("1.0.3") and "infe0" in df.columns: 2413s mark = pytest.mark.xfail(reason="GH 36465") 2413s request.applymarker(mark) 2413s 2413s msg = "inf cannot be used with MySQL" 2413s with pytest.raises(ValueError, match=msg): 2413s df.to_sql(name="foobar", con=conn, index=False) 2413s else: 2413s > assert df.to_sql(name="foobar", con=conn, index=False) == 1 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3427: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( foo 2413s 0 inf,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'foobar'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s 
msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = foo 2413s 0 inf, name = 'foobar' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 
2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... 
stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = foo 2413s 0 inf, name = 'foobar' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 
2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 
2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = 
_raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______ test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_conn] _______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s input = {'foo': [inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________ test_to_sql_with_negative_npinf[input1-mysql_pymysql_engine] _________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s input = {'foo': [-inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0770>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0770> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________ test_to_sql_with_negative_npinf[input1-mysql_pymysql_conn] __________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s input = {'foo': [-inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 
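[editor's note] The mysql_pymysql_engine fixture above fails before any connection is attempted: importing pymysql pulls in pymysql._auth, which imports cryptography's hashes module, and that module raises AttributeError ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'"), which suggests the pure-Python cryptography files and the compiled _rust bindings are out of sync on this testbed. A minimal sketch, assuming the same environment, to reproduce the import failure in isolation:

import importlib

# importing pymysql indirectly executes "Hash = rust_openssl.hashes.Hash"
# in cryptography/hazmat/primitives/hashes.py
try:
    importlib.import_module("pymysql")
    print("pymysql imported cleanly")
except AttributeError as exc:
    print("pymysql import broken by cryptography bindings:", exc)

[end editor's note]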
2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
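The cache-hit check quoted above wraps the == comparison of cache keys in try/except because parametrized fixture values such as numpy arrays do not collapse to a single boolean (the "#6497" comment in the quoted _pytest source). A standalone illustration of why bool() on that comparison raises, assuming only that numpy is available:

    import numpy as np

    a = np.array([1.0, 2.0])
    try:
        bool(a == a.copy())  # element-wise comparison yields an array, not a scalar
    except ValueError as exc:
        # "The truth value of an array with more than one element is ambiguous..."
        print(exc)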
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0830>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0830> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______ test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_engine] ______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
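The frames above show the DSN the postgresql_psycopg2_engine fixture ultimately hands to the driver: host=localhost dbname=pandas user=postgres password=postgres port=5432. A minimal manual check of that service, assuming psycopg2 on the same testbed and using the keyword form from the quoted docstring, fails with the same "Connection refused" OperationalError shown in the traceback below, most likely because no PostgreSQL server is listening on that port:

    import psycopg2

    try:
        # Same parameters as the cparams dict in the frames above.
        psycopg2.connect(host="localhost", dbname="pandas",
                         user="postgres", password="postgres", port=5432)
    except psycopg2.OperationalError as exc:
        print(exc)  # connection to server at "localhost" ... Connection refused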
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s input = {'foo': [-inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s 2413s if "mysql" in conn_name: 2413s # GH 36465 2413s # The input {"foo": [-np.inf], "infe0": ["bar"]} does not raise any error 2413s # for pymysql version >= 0.10 2413s # TODO(GH#36465): remove this version check after GH 36465 is fixed 2413s pymysql = td.versioned_importorskip("pymysql") 2413s 2413s if Version(pymysql.__version__) < Version("1.0.3") and "infe0" in df.columns: 2413s mark = pytest.mark.xfail(reason="GH 36465") 2413s request.applymarker(mark) 2413s 2413s msg = "inf cannot be used with MySQL" 2413s with pytest.raises(ValueError, match=msg): 2413s df.to_sql(name="foobar", con=conn, index=False) 2413s else: 2413s > assert df.to_sql(name="foobar", con=conn, index=False) == 1 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3427: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( foo 2413s 0 -inf,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'foobar'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s 
msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = foo 2413s 0 -inf, name = 'foobar' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 
2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... 
stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = foo 2413s 0 -inf, name = 'foobar' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 
2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 
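pandasSQL_builder, quoted above, is what turns the con argument into a live database handle here: a plain sqlite3.Connection (or None) short-circuits to SQLiteDatabase, while an SQLAlchemy Engine like the postgresql one in this failure goes through SQLDatabase, which immediately opens a real connection via con.connect() (the failing call at io/sql.py:1636). A tiny sketch of the no-server path, assuming only the behaviour visible in the quoted source:

    import sqlite3
    import pandas as pd

    # A DBAPI sqlite3 connection takes the SQLiteDatabase branch of
    # pandasSQL_builder, so nothing needs to be listening on a network port.
    con = sqlite3.connect(":memory:")
    pd.DataFrame({"foo": [1.0]}).to_sql("foobar", con=con, index=False)
    con.close()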
2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = 
_raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
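The psycopg2.connect docstring quoted above lists the keyword parameters; a minimal driver-level sketch using the same parameters as the dsn in this traceback (bypassing SQLAlchemy entirely) might look like this. It is illustrative only, not part of the test suite.

    import psycopg2

    try:
        # Keyword form of the dsn shown above:
        # host=localhost dbname=pandas user=postgres password=postgres port=5432
        conn = psycopg2.connect(
            host="localhost", port=5432, dbname="pandas",
            user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as exc:
        # This is the underlying error that SQLAlchemy wraps into
        # sqlalchemy.exc.OperationalError in the frames above.
        print(exc)
    else:
        conn.close()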
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______ test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_conn] _______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s input = {'foo': [-inf]} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
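request.getfixturevalue, whose docstring appears in the frames below, is how the parametrized pandas test resolves its connection fixture by name at run time. A standalone sketch of that pattern, with hypothetical fixture names rather than the pandas ones, could be:

    import pytest

    @pytest.fixture
    def fast_backend():
        return "sqlite://"

    @pytest.fixture
    def slow_backend():
        return "postgresql://localhost/demo"

    @pytest.mark.parametrize("backend", ["fast_backend", "slow_backend"])
    def test_roundtrip(backend, request):
        # The parameter is only a fixture *name*; resolve it dynamically,
        # exactly as test_to_sql_with_negative_npinf does with its `conn` parameter.
        url = request.getfixturevalue(backend)
        assert url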
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
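The HookCaller.__call__ and _hookexec frames above show pluggy dispatching pytest_fixture_setup with keyword-only arguments and firstresult semantics. A rough, generic pluggy illustration of that mechanism (not pytest's actual hook specification) is:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def fixture_setup(self, name): ...

    class Plugin:
        @hookimpl
        def fixture_setup(self, name):
            return f"built {name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    # Hooks accept keyword arguments only; firstresult stops at the first non-None return.
    print(pm.hook.fixture_setup(name="postgresql_psycopg2_conn"))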
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _________ test_to_sql_with_negative_npinf[input2-mysql_pymysql_engine] _________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s input = {'foo': [-inf], 'infe0': ['bar']} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
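As a usage note for the helper whose docstring appears above: import_optional_dependency only turns a missing module into None (or a raise/warn, per the errors argument); anything else thrown while the module initialises is re-raised, which is why the broken pymysql import below still fails the fixture. A minimal sketch (pandas.compat._optional is a private module, so this is illustrative only):

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore" returns None when the package is simply absent...
    maybe_pymysql = import_optional_dependency("pymysql", errors="ignore")
    # ...but an AttributeError raised during pymysql's own import (as in the
    # traceback that follows) is not an ImportError and still propagates.
    print(maybe_pymysql)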
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0fb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d0fb0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s __________ test_to_sql_with_negative_npinf[input2-mysql_pymysql_conn] __________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s input = {'foo': [-inf], 'infe0': ['bar']} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 
2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
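The getfixturevalue docstring above describes the mechanism the pandas SQL tests rely on: the connectable arrives as a parametrized fixture name (a string), so it has to be resolved at run time rather than declared as a function argument. A minimal stand-alone sketch (fixture and test names here are illustrative, not from the pandas suite):

    import pytest

    @pytest.fixture
    def backend_name():
        return "sqlite"

    @pytest.mark.parametrize("conn", ["backend_name"])
    def test_dynamic_fixture_lookup(conn, request):
        # Resolve the fixture by name, as test_to_sql_with_negative_npinf
        # does with 'mysql_pymysql_conn' above.
        value = request.getfixturevalue(conn)
        assert value == "sqlite"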
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
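For reference on the pluggy HookCaller.__call__ docstring quoted above (keyword-only arguments, optional firstresult): a minimal self-contained sketch of the same calling convention, independent of pytest's own hooks:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def fixture_setup(self, name): ...

    class Plugin:
        @hookimpl
        def fixture_setup(self, name):
            return f"set up {name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    # Keyword-only call; firstresult=True returns the first non-None result
    # rather than a list, matching pytest_fixture_setup above.
    print(pm.hook.fixture_setup(name="mysql_pymysql_engine"))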
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d1070>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d1070> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ______ test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_engine] ______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
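For context on the OperationalError reported just below: the fixture builds a SQLAlchemy engine, but the engine is lazy and only opens a DBAPI connection on first use, which is why the failure surfaces inside to_sql rather than at fixture creation. A minimal sketch of that first connection attempt, using the DSN shown in this traceback (it fails the same way on any host with no PostgreSQL server listening on localhost:5432):

    from sqlalchemy import create_engine, exc

    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        # create_engine() does not connect; this is the first real connection.
        with engine.connect():
            pass
    except exc.OperationalError as error:
        print("connection refused:", error)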
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s input = {'foo': [-inf], 'infe0': ['bar']} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s conn = request.getfixturevalue(conn) 2413s 2413s if "mysql" in conn_name: 2413s # GH 36465 2413s # The input {"foo": [-np.inf], "infe0": ["bar"]} does not raise any error 2413s # for pymysql version >= 0.10 2413s # TODO(GH#36465): remove this version check after GH 36465 is fixed 2413s pymysql = td.versioned_importorskip("pymysql") 2413s 2413s if Version(pymysql.__version__) < Version("1.0.3") and "infe0" in df.columns: 2413s mark = pytest.mark.xfail(reason="GH 36465") 2413s request.applymarker(mark) 2413s 2413s msg = "inf cannot be used with MySQL" 2413s with pytest.raises(ValueError, match=msg): 2413s df.to_sql(name="foobar", con=conn, index=False) 2413s else: 2413s > assert df.to_sql(name="foobar", con=conn, index=False) == 1 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3427: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ( foo infe0 2413s 0 -inf bar,) 2413s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'index': False, 'name': 'foobar'} 2413s 2413s @wraps(func) 2413s def wrapper(*args, **kwargs): 2413s if len(args) > num_allow_args: 2413s warnings.warn( 2413s 
msg.format(arguments=_format_argument_list(allow_args)), 2413s FutureWarning, 2413s stacklevel=find_stack_level(), 2413s ) 2413s > return func(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = foo infe0 2413s 0 -inf bar, name = 'foobar' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None 2413s 2413s @final 2413s @deprecate_nonkeyword_arguments( 2413s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2413s ) 2413s def to_sql( 2413s self, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool_t = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2413s newly created, appended to, or overwritten. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s Name of SQL table. 2413s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. Legacy support is provided for sqlite3.Connection objects. The user 2413s is responsible for engine disposal and connection closure for the SQLAlchemy 2413s connectable. See `here \ 2413s `_. 2413s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2413s the transaction will not be committed. If passing a sqlite3.Connection, 2413s it will not be possible to roll back the record insertion. 2413s 2413s schema : str, optional 2413s Specify the schema (if database flavor supports this). If None, use 2413s default schema. 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s How to behave if the table already exists. 2413s 2413s * fail: Raise a ValueError. 2413s * replace: Drop the table before inserting new values. 2413s * append: Insert new values to the existing table. 2413s 2413s index : bool, default True 2413s Write DataFrame index as a column. Uses `index_label` as the column 2413s name in the table. Creates a table index for this column. 2413s index_label : str or sequence, default None 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s * None : Uses standard SQL ``INSERT`` clause (one per row). 2413s * 'multi': Pass multiple values in a single ``INSERT`` clause. 
2413s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s The number of returned rows affected is the sum of the ``rowcount`` 2413s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2413s reflect the exact number of written rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Raises 2413s ------ 2413s ValueError 2413s When the table already exists and `if_exists` is 'fail' (the 2413s default). 2413s 2413s See Also 2413s -------- 2413s read_sql : Read a DataFrame from a table. 2413s 2413s Notes 2413s ----- 2413s Timezone aware datetime columns will be written as 2413s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2413s database. Otherwise, the datetimes will be stored as timezone unaware 2413s timestamps local to the original timezone. 2413s 2413s Not all datastores support ``method="multi"``. Oracle, for example, 2413s does not support multi-value insert. 2413s 2413s References 2413s ---------- 2413s .. [1] https://docs.sqlalchemy.org 2413s .. [2] https://www.python.org/dev/peps/pep-0249/ 2413s 2413s Examples 2413s -------- 2413s Create an in-memory SQLite database. 2413s 2413s >>> from sqlalchemy import create_engine 2413s >>> engine = create_engine('sqlite://', echo=False) 2413s 2413s Create a table from scratch with 3 rows. 2413s 2413s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2413s >>> df 2413s name 2413s 0 User 1 2413s 1 User 2 2413s 2 User 3 2413s 2413s >>> df.to_sql(name='users', con=engine) 2413s 3 2413s >>> from sqlalchemy import text 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2413s 2413s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2413s 2413s >>> with engine.begin() as connection: 2413s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2413s ... df1.to_sql(name='users', con=connection, if_exists='append') 2413s 2 2413s 2413s This is allowed to support operations that require that the same 2413s DBAPI connection is used for the entire operation. 2413s 2413s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2413s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2413s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2413s (1, 'User 7')] 2413s 2413s Overwrite the table with just ``df2``. 2413s 2413s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2413s ... index_label='id') 2413s 2 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM users")).fetchall() 2413s [(0, 'User 6'), (1, 'User 7')] 2413s 2413s Use ``method`` to define a callable insertion method to do nothing 2413s if there's a primary key conflict on a table in a PostgreSQL database. 2413s 2413s >>> from sqlalchemy.dialects.postgresql import insert 2413s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2413s ... # "a" is the primary key in "conflict_table" 2413s ... 
data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2413s 0 2413s 2413s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2413s on a primary key. 2413s 2413s >>> from sqlalchemy.dialects.mysql import insert 2413s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2413s ... # update columns "b" and "c" on primary key conflict 2413s ... data = [dict(zip(keys, row)) for row in data_iter] 2413s ... stmt = ( 2413s ... insert(table.table) 2413s ... .values(data) 2413s ... ) 2413s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2413s ... result = conn.execute(stmt) 2413s ... return result.rowcount 2413s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2413s 2 2413s 2413s Specify the dtype (especially useful for integers with missing values). 2413s Notice that while pandas is forced to store the data as floating point, 2413s the database supports nullable integers. When fetching the data with 2413s Python, we get back integer scalars. 2413s 2413s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2413s >>> df 2413s A 2413s 0 1.0 2413s 1 NaN 2413s 2 2.0 2413s 2413s >>> from sqlalchemy.types import Integer 2413s >>> df.to_sql(name='integers', con=engine, index=False, 2413s ... dtype={"A": Integer()}) 2413s 3 2413s 2413s >>> with engine.connect() as conn: 2413s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2413s [(1,), (None,), (2,)] 2413s """ # noqa: E501 2413s from pandas.io import sql 2413s 2413s > return sql.to_sql( 2413s self, 2413s name, 2413s con, 2413s schema=schema, 2413s if_exists=if_exists, 2413s index=index, 2413s index_label=index_label, 2413s chunksize=chunksize, 2413s dtype=dtype, 2413s method=method, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s frame = foo infe0 2413s 0 -inf bar, name = 'foobar' 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, if_exists = 'fail', index = False, index_label = None 2413s chunksize = None, dtype = None, method = None, engine = 'auto' 2413s engine_kwargs = {} 2413s 2413s def to_sql( 2413s frame, 2413s name: str, 2413s con, 2413s schema: str | None = None, 2413s if_exists: Literal["fail", "replace", "append"] = "fail", 2413s index: bool = True, 2413s index_label: IndexLabel | None = None, 2413s chunksize: int | None = None, 2413s dtype: DtypeArg | None = None, 2413s method: Literal["multi"] | Callable | None = None, 2413s engine: str = "auto", 2413s **engine_kwargs, 2413s ) -> int | None: 2413s """ 2413s Write records stored in a DataFrame to a SQL database. 2413s 2413s Parameters 2413s ---------- 2413s frame : DataFrame, Series 2413s name : str 2413s Name of SQL table. 2413s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2413s or sqlite3 DBAPI2 connection 2413s ADBC provides high performance I/O with native type support, where available. 2413s Using SQLAlchemy makes it possible to use any DB supported by that 2413s library. 2413s If a DBAPI2 object, only sqlite3 is supported. 
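A minimal, self-contained sketch of the to_sql usage documented in the docstring quoted above, following its own in-memory SQLite example; the table name "users" comes from that docstring, everything else is standard pandas/SQLAlchemy.

# Write a small frame and read it back, mirroring the docstring example.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://", echo=False)
df = pd.DataFrame({"name": ["User 1", "User 2", "User 3"]})

# to_sql returns the number of rows written (sum of driver rowcounts), here 3.
written = df.to_sql(name="users", con=engine, index=False, if_exists="fail")

with engine.connect() as conn:
    rows = conn.execute(text("SELECT * FROM users")).fetchall()
# rows == [('User 1',), ('User 2',), ('User 3',)]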
2413s schema : str, optional 2413s Name of SQL schema in database to write to (if database flavor 2413s supports this). If None, use default schema (default). 2413s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2413s - fail: If table exists, do nothing. 2413s - replace: If table exists, drop it, recreate it, and insert data. 2413s - append: If table exists, insert data. Create if does not exist. 2413s index : bool, default True 2413s Write DataFrame index as a column. 2413s index_label : str or sequence, optional 2413s Column label for index column(s). If None is given (default) and 2413s `index` is True, then the index names are used. 2413s A sequence should be given if the DataFrame uses MultiIndex. 2413s chunksize : int, optional 2413s Specify the number of rows in each batch to be written at a time. 2413s By default, all rows will be written at once. 2413s dtype : dict or scalar, optional 2413s Specifying the datatype for columns. If a dictionary is used, the 2413s keys should be the column names and the values should be the 2413s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2413s scalar is provided, it will be applied to all columns. 2413s method : {None, 'multi', callable}, optional 2413s Controls the SQL insertion clause used: 2413s 2413s - None : Uses standard SQL ``INSERT`` clause (one per row). 2413s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2413s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2413s 2413s Details and a sample callable implementation can be found in the 2413s section :ref:`insert method `. 2413s engine : {'auto', 'sqlalchemy'}, default 'auto' 2413s SQL engine library to use. If 'auto', then the option 2413s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2413s behavior is 'sqlalchemy' 2413s 2413s .. versionadded:: 1.3.0 2413s 2413s **engine_kwargs 2413s Any additional kwargs are passed to the engine. 2413s 2413s Returns 2413s ------- 2413s None or int 2413s Number of rows affected by to_sql. None is returned if the callable 2413s passed into ``method`` does not return an integer number of rows. 2413s 2413s .. versionadded:: 1.4.0 2413s 2413s Notes 2413s ----- 2413s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2413s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2413s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2413s rows as stipulated in the 2413s `sqlite3 `__ or 2413s `SQLAlchemy `__ 2413s """ # noqa: E501 2413s if if_exists not in ("fail", "replace", "append"): 2413s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2413s 2413s if isinstance(frame, Series): 2413s frame = frame.to_frame() 2413s elif not isinstance(frame, DataFrame): 2413s raise NotImplementedError( 2413s "'frame' argument should be either a Series or a DataFrame" 2413s ) 2413s 2413s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = True 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 
2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = 
_raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
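A sketch of the raw DBAPI connection attempt shown in the frame above, using the same parameters reported in dsn/cparams (host=localhost, port=5432, dbname=pandas, user/password=postgres). With no PostgreSQL server listening on localhost:5432 this raises the psycopg2.OperationalError ("Connection refused") recorded in this log.

import psycopg2

# These connection parameters mirror the cparams shown in the traceback.
conn = psycopg2.connect(
    host="localhost",
    port=5432,
    dbname="pandas",
    user="postgres",
    password="postgres",
)
conn.close()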
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s _______ test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_conn] _______ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
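The SQLAlchemy-level view of the same failure, as a sketch: the engine URL is reconstructed from the Engine repr and cparams above, and connect() re-raises the driver error wrapped as sqlalchemy.exc.OperationalError, which is the exception chain shown below.

from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine(
    "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
)
try:
    with engine.connect():
        pass
except OperationalError as err:
    # err.orig is the underlying psycopg2.OperationalError ("Connection refused")
    print(type(err.orig).__name__, err.orig)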
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s input = {'foo': [-inf], 'infe0': ['bar']} 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s @pytest.mark.parametrize( 2413s "input", 2413s [{"foo": [np.inf]}, {"foo": [-np.inf]}, {"foo": [-np.inf], "infe0": ["bar"]}], 2413s ) 2413s def test_to_sql_with_negative_npinf(conn, request, input): 2413s # GH 34431 2413s 2413s df = DataFrame(input) 2413s conn_name = conn 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3410: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
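A sketch of the ``method=`` callable from the to_sql docstring quoted earlier in this log: a PostgreSQL ON CONFLICT DO NOTHING insert. The table and column names ("conflict_table", primary key "a") are the docstring's illustrative ones, not objects created by this test run.

from sqlalchemy.dialects.postgresql import insert

def insert_on_conflict_nothing(table, conn, keys, data_iter):
    # table is the pandas SQLTable wrapper; table.table is the SQLAlchemy Table
    data = [dict(zip(keys, row)) for row in data_iter]
    stmt = (
        insert(table.table)
        .values(data)
        .on_conflict_do_nothing(index_elements=["a"])
    )
    return conn.execute(stmt).rowcount

# Usage (requires a reachable PostgreSQL engine, unlike this test run):
# df_conflict.to_sql("conflict_table", con=engine, if_exists="append",
#                    method=insert_on_conflict_nothing)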
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __________________ test_temporary_table[mysql_pymysql_engine] __________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_temporary_table(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("test does not work with str connection") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3437: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d1430>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d1430> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________________ test_temporary_table[mysql_pymysql_conn] ___________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_temporary_table(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("test does not work with str connection") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3437: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 
2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d1610>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d1610> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________ test_temporary_table[postgresql_psycopg2_engine] _______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_temporary_table(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("test does not work with str connection") 2413s 2413s conn = request.getfixturevalue(conn) 2413s 2413s from sqlalchemy import ( 2413s Column, 2413s Integer, 2413s Unicode, 2413s select, 2413s ) 2413s from sqlalchemy.orm import ( 2413s Session, 2413s declarative_base, 2413s ) 2413s 2413s test_data = "Hello, World!" 2413s expected = DataFrame({"spam": [test_data]}) 2413s Base = declarative_base() 2413s 2413s class Temporary(Base): 2413s __tablename__ = "temp_test" 2413s __table_args__ = {"prefixes": ["TEMPORARY"]} 2413s id = Column(Integer, primary_key=True) 2413s spam = Column(Unicode(30), nullable=False) 2413s 2413s with Session(conn) as session: 2413s with session.begin(): 2413s > conn = session.connection() 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3462: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s bind_arguments = None, execution_options = None 2413s 2413s def connection( 2413s self, 2413s bind_arguments: Optional[_BindArguments] = None, 2413s execution_options: Optional[CoreExecuteOptionsParameter] = None, 2413s ) -> Connection: 2413s r"""Return a :class:`_engine.Connection` object corresponding to this 2413s :class:`.Session` object's transactional state. 
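The test body quoted above builds a declaratively mapped TEMPORARY table and then asks the Session for its Connection, which is the call that fails here. A self-contained sketch of the same pattern against in-memory SQLite, which needs no server (names mirror the test; the flush and read-back steps are assumptions, not pandas code):

import pandas as pd
from sqlalchemy import Column, Integer, Unicode, create_engine, select
from sqlalchemy.orm import Session, declarative_base

engine = create_engine("sqlite://")  # in-memory, no server required
Base = declarative_base()

class Temporary(Base):
    __tablename__ = "temp_test"
    __table_args__ = {"prefixes": ["TEMPORARY"]}
    id = Column(Integer, primary_key=True)
    spam = Column(Unicode(30), nullable=False)

with Session(engine) as session:
    with session.begin():
        conn = session.connection()            # the call that raises in this log
        Temporary.__table__.create(bind=conn)  # CREATE TEMPORARY TABLE temp_test
        session.add(Temporary(spam="Hello, World!"))
        session.flush()                        # emit the INSERT before reading back
        result = pd.read_sql_query(select(Temporary.spam), conn)

print(result)  # one row with spam == "Hello, World!"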
2413s 2413s Either the :class:`_engine.Connection` corresponding to the current 2413s transaction is returned, or if no transaction is in progress, a new 2413s one is begun and the :class:`_engine.Connection` 2413s returned (note that no 2413s transactional state is established with the DBAPI until the first 2413s SQL statement is emitted). 2413s 2413s Ambiguity in multi-bind or unbound :class:`.Session` objects can be 2413s resolved through any of the optional keyword arguments. This 2413s ultimately makes usage of the :meth:`.get_bind` method for resolution. 2413s 2413s :param bind_arguments: dictionary of bind arguments. May include 2413s "mapper", "bind", "clause", other custom arguments that are passed 2413s to :meth:`.Session.get_bind`. 2413s 2413s :param execution_options: a dictionary of execution options that will 2413s be passed to :meth:`_engine.Connection.execution_options`, **when the 2413s connection is first procured only**. If the connection is already 2413s present within the :class:`.Session`, a warning is emitted and 2413s the arguments are ignored. 2413s 2413s .. seealso:: 2413s 2413s :ref:`session_transaction_isolation` 2413s 2413s """ 2413s 2413s if bind_arguments: 2413s bind = bind_arguments.pop("bind", None) 2413s 2413s if bind is None: 2413s bind = self.get_bind(**bind_arguments) 2413s else: 2413s bind = self.get_bind() 2413s 2413s > return self._connection_for_bind( 2413s bind, 2413s execution_options=execution_options, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/orm/session.py:2090: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s execution_options = None, kw = {} 2413s trans = 2413s 2413s def _connection_for_bind( 2413s self, 2413s engine: _SessionBind, 2413s execution_options: Optional[CoreExecuteOptionsParameter] = None, 2413s **kw: Any, 2413s ) -> Connection: 2413s TransactionalContext._trans_ctx_check(self) 2413s 2413s trans = self._transaction 2413s if trans is None: 2413s trans = self._autobegin_t() 2413s > return trans._connection_for_bind(engine, execution_options) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/orm/session.py:2106: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s execution_options = None 2413s 2413s > ??? 
2413s 2413s :2: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fn = 2413s self = 2413s arg = (Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), None) 2413s kw = {}, current_state = 2413s next_state = <_StateChangeStates.ANY: 1>, existing_fn = None 2413s expect_state = 2413s 2413s @util.decorator 2413s def _go(fn: _F, self: Any, *arg: Any, **kw: Any) -> Any: 2413s current_state = self._state 2413s 2413s if ( 2413s has_prerequisite_states 2413s and current_state not in prerequisite_state_collection 2413s ): 2413s self._raise_for_prerequisite_state(fn.__name__, current_state) 2413s 2413s next_state = self._next_state 2413s existing_fn = self._current_fn 2413s expect_state = moves_to if expect_state_change else current_state 2413s 2413s if ( 2413s # destination states are restricted 2413s next_state is not _StateChangeStates.ANY 2413s # method seeks to change state 2413s and expect_state_change 2413s # destination state incorrect 2413s and next_state is not expect_state 2413s ): 2413s if existing_fn and next_state in ( 2413s _StateChangeStates.NO_CHANGE, 2413s _StateChangeStates.CHANGE_IN_PROGRESS, 2413s ): 2413s raise sa_exc.IllegalStateChangeError( 2413s f"Method '{fn.__name__}()' can't be called here; " 2413s f"method '{existing_fn.__name__}()' is already " 2413s f"in progress and this would cause an unexpected " 2413s f"state change to {moves_to!r}", 2413s code="isce", 2413s ) 2413s else: 2413s raise sa_exc.IllegalStateChangeError( 2413s f"Cant run operation '{fn.__name__}()' here; " 2413s f"will move to state {moves_to!r} where we are " 2413s f"expecting {next_state!r}", 2413s code="isce", 2413s ) 2413s 2413s self._current_fn = fn 2413s self._next_state = _StateChangeStates.CHANGE_IN_PROGRESS 2413s try: 2413s > ret_value = fn(self, *arg, **kw) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/orm/state_changes.py:139: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s bind = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s execution_options = None 2413s 2413s @_StateChange.declare_states( 2413s (SessionTransactionState.ACTIVE,), _StateChangeStates.NO_CHANGE 2413s ) 2413s def _connection_for_bind( 2413s self, 2413s bind: _SessionBind, 2413s execution_options: Optional[CoreExecuteOptionsParameter], 2413s ) -> Connection: 2413s if bind in self._connections: 2413s if execution_options: 2413s util.warn( 2413s "Connection is already established for the " 2413s "given bind; execution_options ignored" 2413s ) 2413s return self._connections[bind][0] 2413s 2413s self._state = SessionTransactionState.PROVISIONING_CONNECTION 2413s 2413s local_connect = False 2413s should_commit = True 2413s 2413s try: 2413s if self._parent: 2413s conn = self._parent._connection_for_bind( 2413s bind, execution_options 2413s ) 2413s if not self.nested: 2413s return conn 2413s else: 2413s if isinstance(bind, engine.Connection): 2413s conn = bind 2413s if conn.engine in self._connections: 2413s raise sa_exc.InvalidRequestError( 2413s "Session already has a Connection associated " 2413s "for the given Connection's Engine" 2413s ) 2413s else: 2413s > conn = bind.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/orm/session.py:1189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return 
a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 
2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
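The _handle_dbapi_exception_noconnection frames above wrap the driver error in sqlalchemy.exc.OperationalError and re-raise it "from e", which is why the report says the psycopg2 error was the direct cause of the SQLAlchemy one. A small sketch, under the same assumed URL, of how a caller can get back at the original exception:

import psycopg2
from sqlalchemy import create_engine, text
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
except OperationalError as exc:
    # The driver exception is kept both as .orig and as __cause__.
    assert isinstance(exc.orig, psycopg2.OperationalError)
    assert exc.__cause__ is exc.orig
    print(exc.code)  # "e3q8", the suffix of the sqlalche.me link in the error text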
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
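The psycopg2.connect docstring above describes the DSN string form and the keyword form as interchangeable; a minimal sketch using the same parameters as the cparams shown in this log:

import psycopg2
from psycopg2.extensions import make_dsn

# Keyword arguments are folded into a libpq DSN, as _ext.make_dsn does above.
dsn = make_dsn(host="localhost", dbname="pandas", user="postgres",
               password="postgres", port=5432)
try:
    conn = psycopg2.connect(dsn)
    # equivalently: psycopg2.connect(host="localhost", dbname="pandas", ...)
except psycopg2.OperationalError as exc:
    # With nothing listening on localhost:5432 this is the
    # "Connection refused" error reported above.
    print(exc)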
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________________ test_temporary_table[postgresql_psycopg2_conn] ________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_temporary_table(conn, request): 2413s if conn == "sqlite_str": 2413s pytest.skip("test does not work with str connection") 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3437: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 
2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 
2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
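The postgresql_psycopg2_conn generator fixture a few frames above simply yields engine.connect(), so an unreachable server turns into a fixture setup error. A hypothetical variant (not pandas code) that skips instead when PostgreSQL is down:

import pytest
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

@pytest.fixture
def pg_conn():
    # Hypothetical fixture: skip the test rather than error during setup
    # when nothing is listening on localhost:5432.
    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        conn = engine.connect()
    except OperationalError as exc:
        pytest.skip(f"PostgreSQL not available: {exc.orig}")
    yield conn
    conn.close()
    engine.dispose()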
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s __________________ test_invalid_engine[mysql_pymysql_engine] ___________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_invalid_engine(conn, request, test_frame1): 2413s if conn == "sqlite_buildin" or "adbc" in conn: 2413s request.applymarker( 2413s pytest.mark.xfail( 2413s reason="SQLiteDatabase/ADBCDatabase does not raise for bad engine" 2413s ) 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3479: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 
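[editor's note] The OperationalError above comes from psycopg2 being unable to reach a PostgreSQL server on localhost:5432 inside the test bed. A minimal sketch of the same connection attempt, assuming SQLAlchemy and psycopg2 are installed and reusing the DSN values shown in the log (whether a server is actually listening is environment-specific):

    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect():
            pass
    except sqlalchemy.exc.OperationalError as exc:
        # With nothing accepting TCP/IP connections on port 5432, psycopg2
        # raises "Connection refused" and SQLAlchemy wraps it exactly as in
        # the traceback above.
        print(exc)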
2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 
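[editor's note] For context on the fixture machinery traced above: the pandas SQL tests parametrize over fixture *names* and resolve them lazily with request.getfixturevalue(), which is why a missing database driver surfaces during fixture setup rather than at collection time. A minimal sketch of that pattern (the sqlite_str fixture here is a hypothetical stand-in, not one of the pandas fixtures):

    import pytest

    @pytest.fixture
    def sqlite_str():
        # Stand-in connection fixture; the real pandas fixtures build engines
        # or live database connections.
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_str"])
    def test_roundtrip(conn, request):
        conn = request.getfixturevalue(conn)  # runs the named fixture on demand
        assert conn.startswith("sqlite")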
2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 
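[editor's note] The frames above pass through pluggy's hook dispatch into pytest's pytest_fixture_setup implementations. As a rough illustration of the wrapper-style hook seen in _pytest/setuponly.py, a conftest.py can observe fixture setup like this (a sketch only, assuming a pytest/pluggy version with wrapper hook support):

    import pytest

    @pytest.hookimpl(wrapper=True)
    def pytest_fixture_setup(fixturedef, request):
        # Code before the yield runs before the remaining hook implementations;
        # the yield hands back their result, which is passed through unchanged.
        print(f"setting up fixture {fixturedef.argname!r}")
        result = yield
        return result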
2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 
2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 
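[editor's note] versioned_importorskip, as its docstring above says, is a Debian-specific wrapper around pandas' import_optional_dependency. With stock pytest the closest equivalent is pytest.importorskip, roughly as below (the "1.0" minimum version is an arbitrary placeholder, not pandas' actual requirement):

    import pytest

    # Skip the module if pymysql is missing or older than the placeholder
    # minimum; otherwise return the imported module.
    pymysql = pytest.importorskip("pymysql", minversion="1.0")

    def test_uses_pymysql():
        assert hasattr(pymysql, "connect")

Note that importorskip-style helpers only skip on ImportError; the AttributeError raised during the pymysql import in this log propagates instead, which is why these tests error rather than skip.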
2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d2150>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d2150> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 
2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
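[editor's note] The pymysql module source quoted above includes the mysqlclient-compatibility shim it describes. On a machine where pymysql imports cleanly (unlike this test bed), the shim works roughly like this:

    import pymysql

    pymysql.install_as_MySQLdb()   # aliases sys.modules["MySQLdb"] to pymysql

    import MySQLdb                 # now resolves to the pymysql module

    assert MySQLdb is pymysql
    print(MySQLdb.get_client_info())  # mysqlclient-compatible version string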
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________________ test_invalid_engine[mysql_pymysql_conn] ____________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_invalid_engine(conn, request, test_frame1): 2413s if conn == "sqlite_buildin" or "adbc" in conn: 2413s request.applymarker( 2413s pytest.mark.xfail( 2413s reason="SQLiteDatabase/ADBCDatabase does not raise for bad engine" 2413s ) 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3479: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 
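[editor's note] The AttributeError above is the root cause of both pymysql failures: cryptography's pure-Python hashes module expects a hashes submodule in its compiled _rust.openssl bindings, and the bindings installed on this test bed do not provide one, so importing pymysql._auth fails before any MySQL connection is attempted. On an installation where the Python and Rust halves of cryptography match, the API being bound on that failing line is used like this (sketch):

    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas autopkgtest")
    print(digest.finalize().hex())  # SHA-256 digest of the input bytes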
2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d22d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d22d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________ test_invalid_engine[postgresql_psycopg2_engine] ________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_invalid_engine(conn, request, test_frame1): 2413s if conn == "sqlite_buildin" or "adbc" in conn: 2413s request.applymarker( 2413s pytest.mark.xfail( 2413s reason="SQLiteDatabase/ADBCDatabase does not raise for bad engine" 2413s ) 2413s ) 2413s 2413s conn = request.getfixturevalue(conn) 2413s msg = "engine must be one of 'auto', 'sqlalchemy'" 2413s > with pandasSQL_builder(conn) as pandasSQL: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3481: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________________ test_invalid_engine[postgresql_psycopg2_conn] _________________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_invalid_engine(conn, request, test_frame1): 2413s if conn == "sqlite_buildin" or "adbc" in conn: 2413s request.applymarker( 2413s pytest.mark.xfail( 2413s reason="SQLiteDatabase/ADBCDatabase does not raise for bad engine" 2413s ) 2413s ) 2413s 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3479: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________ test_to_sql_with_sql_engine[mysql_pymysql_engine] _______________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_with_sql_engine(conn, request, test_frame1): 2413s """`to_sql` with the `engine` param""" 2413s # mostly copied from this class's `_to_sql()` method 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3490: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
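The psycopg2 OperationalError that closes the traceback above is environmental rather than a pandas bug: nothing is listening on localhost:5432 inside the testbed, so the connection is refused on both the IPv6 and IPv4 addresses. A minimal sketch, assuming only the DSN shown in the traceback (the password masked as *** in the engine URL appears as 'postgres' in the psycopg2 kwargs), of how the same condition surfaces through SQLAlchemy:

    import sqlalchemy
    from sqlalchemy.exc import OperationalError

    # Same URL the pandas fixture builds for this testbed.
    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )
    try:
        with engine.connect() as conn:
            conn.execute(sqlalchemy.text("SELECT 1"))
    except OperationalError as exc:
        # With no PostgreSQL server running, this raises the same
        # (psycopg2.OperationalError) "connection refused" seen in the log.
        print(f"database unreachable: {exc}")
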
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
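The pytest frames above show how the string parameter becomes a live fixture at run time: test_to_sql_with_sql_engine is parametrized over fixture names, and request.getfixturevalue() drives _get_active_fixturedef() and FixtureDef.execute() to build each one on demand. A stripped-down sketch of the same pattern (an illustration, not pandas' actual conftest, which defines many more connectable fixtures):

    import pytest

    @pytest.fixture
    def sqlite_engine():
        # skip (rather than fail) when the optional dependency is missing,
        # the role versioned_importorskip plays in the traceback above
        sqlalchemy = pytest.importorskip("sqlalchemy")
        return sqlalchemy.create_engine("sqlite://")

    # the parametrize values are fixture *names*; the test resolves them itself
    @pytest.mark.parametrize("conn", ["sqlite_engine"])
    def test_roundtrip(conn, request):
        sqlalchemy = pytest.importorskip("sqlalchemy")
        engine = request.getfixturevalue(conn)
        with engine.connect() as connection:
            assert connection.execute(sqlalchemy.text("SELECT 1")).scalar() == 1
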
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d2c90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d2c90> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _______________ test_to_sql_with_sql_engine[mysql_pymysql_conn] ________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_with_sql_engine(conn, request, test_frame1): 2413s """`to_sql` with the `engine` param""" 2413s # mostly copied from this class's `_to_sql()` method 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3490: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 
2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d2db0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d2db0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s ___________ test_to_sql_with_sql_engine[postgresql_psycopg2_engine] ____________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_with_sql_engine(conn, request, test_frame1): 2413s """`to_sql` with the `engine` param""" 2413s # mostly copied from this class's `_to_sql()` method 2413s conn = request.getfixturevalue(conn) 2413s > with pandasSQL_builder(conn) as pandasSQL: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3491: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ____________ test_to_sql_with_sql_engine[postgresql_psycopg2_conn] _____________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", all_connectable) 2413s def test_to_sql_with_sql_engine(conn, request, test_frame1): 2413s """`to_sql` with the `engine` param""" 2413s # mostly copied from this class's `_to_sql()` method 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3490: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s @pytest.fixture 2413s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2413s > with postgresql_psycopg2_engine.connect() as conn: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ________________ test_options_sqlalchemy[mysql_pymysql_engine] _________________ 2413s conn = 'mysql_pymysql_engine' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_options_sqlalchemy(conn, request, test_frame1): 2413s # use the set option 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3504: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
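The psycopg2.OperationalError above ("Connection refused" on localhost:5432) means no PostgreSQL server was reachable on the testbed, so every postgresql_psycopg2_* fixture fails at connect time. A minimal probe against the same DSN fields shown in the traceback (host, dbname, user, password, and port are copied from the log, not independently checked):

import psycopg2
from psycopg2 import OperationalError

try:
    conn = psycopg2.connect(
        host="localhost", port=5432,
        dbname="pandas", user="postgres", password="postgres",
    )
except OperationalError as exc:
    # Same failure mode as in the log: nothing listening on port 5432.
    print(f"cannot reach PostgreSQL: {exc}")
else:
    conn.close()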
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
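The pytest internals above (_get_active_fixturedef, FixtureDef.execute, pytest_fixture_setup) are all driven by the request.getfixturevalue(conn) call in pandas' test_options_sqlalchemy. A reduced sketch of that pattern, with an illustrative fixture name rather than pandas' actual connectable fixtures:

import pytest

@pytest.fixture
def sqlite_engine():
    # Stand-in for pandas' engine fixtures; built lazily like them.
    sqlalchemy = pytest.importorskip("sqlalchemy")
    engine = sqlalchemy.create_engine("sqlite://")
    yield engine
    engine.dispose()

@pytest.mark.parametrize("conn", ["sqlite_engine"])
def test_roundtrip(conn, request):
    # The fixture is resolved by name at test time, as in test_sql.py;
    # any error raised while setting it up surfaces here as a test failure.
    conn = request.getfixturevalue(conn)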
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
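import_optional_dependency, whose docstring is quoted above, is what turns a missing or too-old optional package into an ImportError (errors="raise"), a warning plus None (errors="warn"), or a silent None (errors="ignore"). The fixture path in this log uses the default "raise"; note that the AttributeError further down is raised while pymysql itself is being imported, which is why it is not converted into a skip. A small sketch against that documented behaviour (this is a private pandas helper, so the import path is an internal detail):

from pandas.compat._optional import import_optional_dependency

# errors="ignore": return the module if the import succeeds, None if the
# package is simply not installed, without raising an ImportError.
mod = import_optional_dependency("pymysql", errors="ignore")
if mod is None:
    print("pymysql not installed in this environment")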
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d37d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d37d0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _________________ test_options_sqlalchemy[mysql_pymysql_conn] __________________ 2413s conn = 'mysql_pymysql_conn' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_options_sqlalchemy(conn, request, test_frame1): 2413s # use the set option 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3504: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 
2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 
2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s > fixturedef = request._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'mysql_pymysql_engine' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 
2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 
2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2413s not self.is_historic() 2413s ), "Cannot directly call a historic hook - use call_historic instead." 2413s self._verify_all_args_are_provided(kwargs) 2413s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2413s # Copy because plugins may register other plugins during iteration (#438). 2413s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2413s hook_name = 'pytest_fixture_setup' 2413s methods = [>] 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def _hookexec( 2413s self, 2413s hook_name: str, 2413s methods: Sequence[HookImpl], 2413s kwargs: Mapping[str, object], 2413s firstresult: bool, 2413s ) -> object | list[object]: 2413s # called from all hookcaller instances. 
2413s # enable_tracing will set its own wrapping function at self._inner_hookexec 2413s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2413s 2413s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s @pytest.hookimpl(wrapper=True) 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[object], request: SubRequest 2413s ) -> Generator[None, object, object]: 2413s try: 2413s > return (yield) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturedef = 2413s request = > 2413s 2413s def pytest_fixture_setup( 2413s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2413s ) -> FixtureValue: 2413s """Execution of fixture setup.""" 2413s kwargs = {} 2413s for argname in fixturedef.argnames: 2413s kwargs[argname] = request.getfixturevalue(argname) 2413s 2413s fixturefunc = resolve_fixture_function(fixturedef, request) 2413s my_cache_key = fixturedef.cache_key(request) 2413s try: 2413s > result = call_fixture_func(fixturefunc, request, kwargs) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s fixturefunc = 2413s request = > 2413s kwargs = {} 2413s 2413s def call_fixture_func( 2413s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2413s ) -> FixtureValue: 2413s if is_generator(fixturefunc): 2413s fixturefunc = cast( 2413s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2413s ) 2413s generator = fixturefunc(**kwargs) 2413s try: 2413s > fixture_result = next(generator) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s @pytest.fixture 2413s def mysql_pymysql_engine(): 2413s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2413s > pymysql = td.versioned_importorskip("pymysql") 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s args = ('pymysql',), kwargs = {} 2413s 2413s def versioned_importorskip(*args, **kwargs): 2413s """ 2413s (warning - this is currently Debian-specific, the name may change if upstream request this) 2413s 2413s Return the requested module, or skip the test if it is 2413s not available in a new enough version. 2413s 2413s Intended as a replacement for pytest.importorskip that 2413s defaults to requiring at least pandas' minimum version for that 2413s optional dependency, rather than any version. 2413s 2413s See import_optional_dependency for full parameter documentation. 2413s """ 2413s try: 2413s > module = import_optional_dependency(*args, **kwargs) 2413s 2413s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2413s 2413s def import_optional_dependency( 2413s name: str, 2413s extra: str = "", 2413s errors: str = "raise", 2413s min_version: str | None = None, 2413s ): 2413s """ 2413s Import an optional dependency. 
2413s 2413s By default, if a dependency is missing an ImportError with a nice 2413s message will be raised. If a dependency is present, but too old, 2413s we raise. 2413s 2413s Parameters 2413s ---------- 2413s name : str 2413s The module name. 2413s extra : str 2413s Additional text to include in the ImportError message. 2413s errors : str {'raise', 'warn', 'ignore'} 2413s What to do when a dependency is not found or its version is too old. 2413s 2413s * raise : Raise an ImportError 2413s * warn : Only applicable when a module's version is to old. 2413s Warns that the version is too old and returns None 2413s * ignore: If the module is not installed, return None, otherwise, 2413s return the module, even if the version is too old. 2413s It's expected that users validate the version locally when 2413s using ``errors="ignore"`` (see. ``io/html.py``) 2413s min_version : str, default None 2413s Specify a minimum version that is different from the global pandas 2413s minimum version required. 2413s Returns 2413s ------- 2413s maybe_module : Optional[ModuleType] 2413s The imported module, when found and the version is correct. 2413s None is returned when the package is not found and `errors` 2413s is False, or when the package's version is too old and `errors` 2413s is ``'warn'`` or ``'ignore'``. 2413s """ 2413s assert errors in {"warn", "raise", "ignore"} 2413s if name=='numba' and warn_numba_platform: 2413s warnings.warn(warn_numba_platform) 2413s 2413s package_name = INSTALL_MAPPING.get(name) 2413s install_name = package_name if package_name is not None else name 2413s 2413s msg = ( 2413s f"Missing optional dependency '{install_name}'. {extra} " 2413s f"Use pip or conda to install {install_name}." 2413s ) 2413s try: 2413s > module = importlib.import_module(name) 2413s 2413s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None 2413s 2413s def import_module(name, package=None): 2413s """Import a module. 2413s 2413s The 'package' argument is required when performing a relative import. It 2413s specifies the package to use as the anchor point from which to resolve the 2413s relative import to an absolute import. 2413s 2413s """ 2413s level = 0 2413s if name.startswith('.'): 2413s if not package: 2413s raise TypeError("the 'package' argument is required to perform a " 2413s f"relative import for {name!r}") 2413s for character in name: 2413s if character != '.': 2413s break 2413s level += 1 2413s > return _bootstrap._gcd_import(name[level:], package, level) 2413s 2413s /usr/lib/python3.13/importlib/__init__.py:88: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', package = None, level = 0 2413s 2413s > ??? 2413s 2413s :1387: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 2413s 2413s :1360: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s name = 'pymysql', import_ = 2413s 2413s > ??? 
2413s 2413s :1331: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d38f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2413s 2413s > ??? 2413s 2413s :935: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d38f0> 2413s module = 2413s 2413s > ??? 2413s 2413s :1022: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s f = 2413s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2413s kwds = {} 2413s 2413s > ??? 2413s 2413s :488: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s PyMySQL: A pure-Python MySQL client library. 2413s 2413s Copyright (c) 2010-2016 PyMySQL contributors 2413s 2413s Permission is hereby granted, free of charge, to any person obtaining a copy 2413s of this software and associated documentation files (the "Software"), to deal 2413s in the Software without restriction, including without limitation the rights 2413s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2413s copies of the Software, and to permit persons to whom the Software is 2413s furnished to do so, subject to the following conditions: 2413s 2413s The above copyright notice and this permission notice shall be included in 2413s all copies or substantial portions of the Software. 2413s 2413s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2413s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2413s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2413s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2413s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2413s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2413s THE SOFTWARE. 2413s """ 2413s 2413s import sys 2413s 2413s from .constants import FIELD_TYPE 2413s from .err import ( 2413s Warning, 2413s Error, 2413s InterfaceError, 2413s DataError, 2413s DatabaseError, 2413s OperationalError, 2413s IntegrityError, 2413s InternalError, 2413s NotSupportedError, 2413s ProgrammingError, 2413s MySQLError, 2413s ) 2413s from .times import ( 2413s Date, 2413s Time, 2413s Timestamp, 2413s DateFromTicks, 2413s TimeFromTicks, 2413s TimestampFromTicks, 2413s ) 2413s 2413s # PyMySQL version. 2413s # Used by setuptools and connection_attrs 2413s VERSION = (1, 1, 1, "final", 1) 2413s VERSION_STRING = "1.1.1" 2413s 2413s ### for mysqlclient compatibility 2413s ### Django checks mysqlclient version. 2413s version_info = (1, 4, 6, "final", 1) 2413s __version__ = "1.4.6" 2413s 2413s 2413s def get_client_info(): # for MySQLdb compatibility 2413s return __version__ 2413s 2413s 2413s def install_as_MySQLdb(): 2413s """ 2413s After this function is called, any application that imports MySQLdb 2413s will unwittingly actually use pymysql. 
2413s """ 2413s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2413s 2413s 2413s # end of mysqlclient compatibility code 2413s 2413s threadsafety = 1 2413s apilevel = "2.0" 2413s paramstyle = "pyformat" 2413s 2413s > from . import connections # noqa: E402 2413s 2413s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # Python implementation of the MySQL client-server protocol 2413s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2413s # Error codes: 2413s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2413s import errno 2413s import os 2413s import socket 2413s import struct 2413s import sys 2413s import traceback 2413s import warnings 2413s 2413s > from . import _auth 2413s 2413s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s """ 2413s Implements auth methods 2413s """ 2413s 2413s from .err import OperationalError 2413s 2413s 2413s try: 2413s from cryptography.hazmat.backends import default_backend 2413s > from cryptography.hazmat.primitives import serialization, hashes 2413s 2413s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s > from cryptography.hazmat.primitives._serialization import ( 2413s BestAvailableEncryption, 2413s Encoding, 2413s KeySerializationEncryption, 2413s NoEncryption, 2413s ParameterFormat, 2413s PrivateFormat, 2413s PublicFormat, 2413s _KeySerializationEncryption, 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography import utils 2413s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s # This file is dual licensed under the terms of the Apache License, Version 2413s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2413s # for complete details. 
2413s 2413s from __future__ import annotations 2413s 2413s import abc 2413s 2413s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2413s 2413s __all__ = [ 2413s "HashAlgorithm", 2413s "HashContext", 2413s "Hash", 2413s "ExtendableOutputFunction", 2413s "SHA1", 2413s "SHA512_224", 2413s "SHA512_256", 2413s "SHA224", 2413s "SHA256", 2413s "SHA384", 2413s "SHA512", 2413s "SHA3_224", 2413s "SHA3_256", 2413s "SHA3_384", 2413s "SHA3_512", 2413s "SHAKE128", 2413s "SHAKE256", 2413s "MD5", 2413s "BLAKE2b", 2413s "BLAKE2s", 2413s "SM3", 2413s ] 2413s 2413s 2413s class HashAlgorithm(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def name(self) -> str: 2413s """ 2413s A string naming this algorithm (e.g. "sha256", "md5"). 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def digest_size(self) -> int: 2413s """ 2413s The size of the resulting digest in bytes. 2413s """ 2413s 2413s @property 2413s @abc.abstractmethod 2413s def block_size(self) -> int | None: 2413s """ 2413s The internal block size of the hash function, or None if the hash 2413s function does not use blocks internally (e.g. SHA3). 2413s """ 2413s 2413s 2413s class HashContext(metaclass=abc.ABCMeta): 2413s @property 2413s @abc.abstractmethod 2413s def algorithm(self) -> HashAlgorithm: 2413s """ 2413s A HashAlgorithm that will be used by this context. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def update(self, data: bytes) -> None: 2413s """ 2413s Processes the provided bytes through the hash. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def finalize(self) -> bytes: 2413s """ 2413s Finalizes the hash context and returns the hash digest as bytes. 2413s """ 2413s 2413s @abc.abstractmethod 2413s def copy(self) -> HashContext: 2413s """ 2413s Return a HashContext that is a copy of the current context. 2413s """ 2413s 2413s 2413s > Hash = rust_openssl.hashes.Hash 2413s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2413s 2413s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2413s _____________ test_options_sqlalchemy[postgresql_psycopg2_engine] ______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 
2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s 
try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. 
The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_options_sqlalchemy(conn, request, test_frame1): 2413s # use the set option 2413s conn = request.getfixturevalue(conn) 2413s with pd.option_context("io.sql.engine", "sqlalchemy"): 2413s > with pandasSQL_builder(conn) as pandasSQL: 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3506: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def pandasSQL_builder( 2413s con, 2413s schema: str | None = None, 2413s need_transaction: bool = False, 2413s ) -> PandasSQL: 2413s """ 2413s Convenience function to return the correct PandasSQL subclass based on the 2413s provided parameters. Also creates a sqlalchemy connection and transaction 2413s if necessary. 
2413s """ 2413s import sqlite3 2413s 2413s if isinstance(con, sqlite3.Connection) or con is None: 2413s return SQLiteDatabase(con) 2413s 2413s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2413s 2413s if isinstance(con, str) and sqlalchemy is None: 2413s raise ImportError("Using URI string without sqlalchemy installed.") 2413s 2413s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2413s > return SQLDatabase(con, schema, need_transaction) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s schema = None, need_transaction = False 2413s 2413s def __init__( 2413s self, con, schema: str | None = None, need_transaction: bool = False 2413s ) -> None: 2413s from sqlalchemy import create_engine 2413s from sqlalchemy.engine import Engine 2413s from sqlalchemy.schema import MetaData 2413s 2413s # self.exit_stack cleans up the Engine and Connection and commits the 2413s # transaction if any of those objects was created below. 2413s # Cleanup happens either in self.__exit__ or at the end of the iterator 2413s # returned by read_sql when chunksize is not None. 2413s self.exit_stack = ExitStack() 2413s if isinstance(con, str): 2413s con = create_engine(con) 2413s self.exit_stack.callback(con.dispose) 2413s if isinstance(con, Engine): 2413s > con = self.exit_stack.enter_context(con.connect()) 2413s 2413s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def connect(self) -> Connection: 2413s """Return a new :class:`_engine.Connection` object. 2413s 2413s The :class:`_engine.Connection` acts as a Python context manager, so 2413s the typical use of this method looks like:: 2413s 2413s with engine.connect() as connection: 2413s connection.execute(text("insert into table values ('foo')")) 2413s connection.commit() 2413s 2413s Where above, after the block is completed, the connection is "closed" 2413s and its underlying DBAPI resources are returned to the connection pool. 2413s This also has the effect of rolling back any transaction that 2413s was explicitly begun or was begun via autobegin, and will 2413s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2413s started and is still in progress. 2413s 2413s .. 
seealso:: 2413s 2413s :meth:`_engine.Engine.begin` 2413s 2413s """ 2413s 2413s > return self._connection_cls(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s self._dbapi_connection = engine.raw_connection() 2413s except dialect.loaded_dbapi.Error as err: 2413s > Connection._handle_dbapi_exception_noconnection( 2413s err, dialect, engine 2413s ) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2413s dialect = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2413s 2413s @classmethod 2413s def _handle_dbapi_exception_noconnection( 2413s cls, 2413s e: BaseException, 2413s dialect: Dialect, 2413s engine: Optional[Engine] = None, 2413s is_disconnect: Optional[bool] = None, 2413s invalidate_pool_on_disconnect: bool = True, 2413s is_pre_ping: bool = False, 2413s ) -> NoReturn: 2413s exc_info = sys.exc_info() 2413s 2413s if is_disconnect is None: 2413s is_disconnect = isinstance( 2413s e, dialect.loaded_dbapi.Error 2413s ) and dialect.is_disconnect(e, None, None) 2413s 2413s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2413s 2413s if should_wrap: 2413s sqlalchemy_exception = exc.DBAPIError.instance( 2413s None, 2413s None, 2413s cast(Exception, e), 2413s dialect.loaded_dbapi.Error, 2413s hide_parameters=( 2413s engine.hide_parameters if engine is not None else False 2413s ), 2413s connection_invalidated=is_disconnect, 2413s dialect=dialect, 2413s ) 2413s else: 2413s sqlalchemy_exception = None 2413s 2413s newraise = None 2413s 2413s if dialect._has_events: 2413s ctx = ExceptionContextImpl( 2413s e, 2413s sqlalchemy_exception, 2413s engine, 2413s dialect, 2413s None, 2413s None, 2413s None, 2413s None, 2413s None, 2413s is_disconnect, 2413s invalidate_pool_on_disconnect, 2413s is_pre_ping, 2413s ) 2413s for fn in dialect.dispatch.handle_error: 2413s try: 2413s # handler returns an exception; 2413s # call next handler in a chain 2413s per_fn = fn(ctx) 2413s if per_fn is not None: 2413s ctx.chained_exception = newraise = per_fn 2413s except Exception as _raised: 2413s # handler raises an exception - stop processing 2413s newraise = _raised 2413s break 2413s 2413s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2413s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2413s ctx.is_disconnect 2413s ) 2413s 2413s if newraise: 2413s raise 
newraise.with_traceback(exc_info[2]) from e 2413s elif should_wrap: 2413s assert sqlalchemy_exception is not None 2413s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E 2413s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s ______________ test_options_sqlalchemy[postgresql_psycopg2_conn] _______________ 2413s self = 2413s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s connection = None, _has_events = None, _allow_revalidate = True 2413s _allow_autobegin = True 2413s 2413s def __init__( 2413s self, 2413s engine: Engine, 2413s connection: Optional[PoolProxiedConnection] = None, 2413s _has_events: Optional[bool] = None, 2413s _allow_revalidate: bool = True, 2413s _allow_autobegin: bool = True, 2413s ): 2413s """Construct a new Connection.""" 2413s self.engine = engine 2413s self.dialect = dialect = engine.dialect 2413s 2413s if connection is None: 2413s try: 2413s > self._dbapi_connection = engine.raw_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2413s 2413s def raw_connection(self) -> PoolProxiedConnection: 2413s """Return a "raw" DBAPI connection from the connection pool. 2413s 2413s The returned object is a proxied version of the DBAPI 2413s connection object used by the underlying driver in use. 2413s The object will have all the same behavior as the real DBAPI 2413s connection, except that its ``close()`` method will result in the 2413s connection being returned to the pool, rather than being closed 2413s for real. 2413s 2413s This method provides direct DBAPI connection access for 2413s special situations when the API provided by 2413s :class:`_engine.Connection` 2413s is not needed. When a :class:`_engine.Connection` object is already 2413s present, the DBAPI connection is available using 2413s the :attr:`_engine.Connection.connection` accessor. 2413s 2413s .. seealso:: 2413s 2413s :ref:`dbapi_connections` 2413s 2413s """ 2413s > return self.pool.connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def connect(self) -> PoolProxiedConnection: 2413s """Return a DBAPI connection from the pool. 2413s 2413s The connection is instrumented such that when its 2413s ``close()`` method is called, the connection will be returned to 2413s the pool. 
2413s 2413s """ 2413s > return _ConnectionFairy._checkout(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s threadconns = None, fairy = None 2413s 2413s @classmethod 2413s def _checkout( 2413s cls, 2413s pool: Pool, 2413s threadconns: Optional[threading.local] = None, 2413s fairy: Optional[_ConnectionFairy] = None, 2413s ) -> _ConnectionFairy: 2413s if not fairy: 2413s > fairy = _ConnectionRecord.checkout(pool) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s cls = 2413s pool = 2413s 2413s @classmethod 2413s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2413s if TYPE_CHECKING: 2413s rec = cast(_ConnectionRecord, pool._do_get()) 2413s else: 2413s > rec = pool._do_get() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _do_get(self) -> ConnectionPoolEntry: 2413s > return self._create_connection() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def _create_connection(self) -> ConnectionPoolEntry: 2413s """Called by subclasses to create a new ConnectionRecord.""" 2413s 2413s > return _ConnectionRecord(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s pool = , connect = True 2413s 2413s def __init__(self, pool: Pool, connect: bool = True): 2413s self.fresh = False 2413s self.fairy_ref = None 2413s self.starttime = 0 2413s self.dbapi_connection = None 2413s 2413s self.__pool = pool 2413s if connect: 2413s > self.__connect() 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s self.dbapi_connection = connection = pool._invoke_creator(self) 2413s pool.logger.debug("Created new connection %r", connection) 2413s self.fresh = True 2413s except BaseException as e: 2413s > with util.safe_reraise(): 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s type_ = None, value = None, traceback = None 2413s 2413s def __exit__( 2413s self, 2413s type_: Optional[Type[BaseException]], 2413s value: Optional[BaseException], 2413s traceback: Optional[types.TracebackType], 2413s ) -> NoReturn: 2413s assert self._exc_info is not None 2413s # see #2703 for notes 2413s if type_ is None: 2413s exc_type, exc_value, exc_tb = self._exc_info 2413s assert exc_value is not None 2413s self._exc_info = None # remove potential circular references 2413s > raise exc_value.with_traceback(exc_tb) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s 2413s def __connect(self) -> None: 2413s pool = self.__pool 2413s 2413s # ensure any existing connection is removed, so that if 2413s # creator fails, this attribute stays None 2413s self.dbapi_connection = None 2413s try: 2413s self.starttime = time.time() 2413s > self.dbapi_connection = connection = pool._invoke_creator(self) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s connection_record = 2413s 2413s def connect( 2413s connection_record: Optional[ConnectionPoolEntry] = None, 2413s ) -> DBAPIConnection: 2413s if dialect._has_events: 2413s for fn in dialect.dispatch.do_connect: 2413s connection = cast( 2413s DBAPIConnection, 2413s fn(dialect, connection_record, cargs, cparams), 2413s ) 2413s if connection is not None: 2413s return connection 2413s 2413s > return dialect.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s cargs = () 2413s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s 2413s def connect(self, *cargs, **cparams): 2413s # inherits the docstring from interfaces.Dialect.connect 2413s > return self.loaded_dbapi.connect(*cargs, **cparams) 2413s 2413s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2413s connection_factory = None, cursor_factory = None 2413s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2413s kwasync = {} 2413s 2413s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2413s """ 2413s Create a new database connection. 2413s 2413s The connection parameters can be specified as a string: 2413s 2413s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2413s 2413s or using a set of keyword arguments: 2413s 2413s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2413s 2413s Or as a mix of both. The basic connection parameters are: 2413s 2413s - *dbname*: the database name 2413s - *database*: the database name (only as keyword argument) 2413s - *user*: user name used to authenticate 2413s - *password*: password used to authenticate 2413s - *host*: database host address (defaults to UNIX socket if not provided) 2413s - *port*: connection port number (defaults to 5432 if not provided) 2413s 2413s Using the *connection_factory* parameter a different class or connections 2413s factory can be specified. It should be a callable object taking a dsn 2413s argument. 2413s 2413s Using the *cursor_factory* parameter, a new default cursor factory will be 2413s used by cursor(). 2413s 2413s Using *async*=True an asynchronous connection will be created. *async_* is 2413s a valid alias (for Python versions where ``async`` is a keyword). 2413s 2413s Any other keyword parameter will be passed to the underlying client 2413s library: the list of supported parameters depends on the library version. 
2413s 2413s """ 2413s kwasync = {} 2413s if 'async' in kwargs: 2413s kwasync['async'] = kwargs.pop('async') 2413s if 'async_' in kwargs: 2413s kwasync['async_'] = kwargs.pop('async_') 2413s 2413s dsn = _ext.make_dsn(dsn, **kwargs) 2413s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2413s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2413s E Is the server running on that host and accepting TCP/IP connections? 2413s 2413s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2413s 2413s The above exception was the direct cause of the following exception: 2413s 2413s conn = 'postgresql_psycopg2_conn' 2413s request = > 2413s test_frame1 = index A B C D 2413s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2413s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2413s 2413s @pytest.mark.parametrize("conn", sqlalchemy_connectable) 2413s def test_options_sqlalchemy(conn, request, test_frame1): 2413s # use the set option 2413s > conn = request.getfixturevalue(conn) 2413s 2413s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3504: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def getfixturevalue(self, argname: str) -> Any: 2413s """Dynamically run a named fixture function. 2413s 2413s Declaring fixtures via function argument is recommended where possible. 2413s But if you can only decide whether to use another fixture at test 2413s setup time, you may use this function to retrieve it inside a fixture 2413s or test function body. 2413s 2413s This method can be used during the test setup phase or the test run 2413s phase, but during the test teardown phase a fixture's value may not 2413s be available. 2413s 2413s :param argname: 2413s The fixture name. 2413s :raises pytest.FixtureLookupError: 2413s If the given fixture could not be found. 2413s """ 2413s # Note that in addition to the use case described in the docstring, 2413s # getfixturevalue() is also called by pytest itself during item and fixture 2413s # setup to evaluate the fixtures that are requested statically 2413s # (using function parameters, autouse, etc). 2413s 2413s > fixturedef = self._get_active_fixturedef(argname) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = > 2413s argname = 'postgresql_psycopg2_conn' 2413s 2413s def _get_active_fixturedef( 2413s self, argname: str 2413s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2413s if argname == "request": 2413s cached_result = (self, [0], None) 2413s return PseudoFixtureDef(cached_result, Scope.Function) 2413s 2413s # If we already finished computing a fixture by this name in this item, 2413s # return it. 2413s fixturedef = self._fixture_defs.get(argname) 2413s if fixturedef is not None: 2413s self._check_scope(fixturedef, fixturedef._scope) 2413s return fixturedef 2413s 2413s # Find the appropriate fixturedef. 
2413s fixturedefs = self._arg2fixturedefs.get(argname, None) 2413s if fixturedefs is None: 2413s # We arrive here because of a dynamic call to 2413s # getfixturevalue(argname) which was naturally 2413s # not known at parsing/collection time. 2413s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2413s if fixturedefs is not None: 2413s self._arg2fixturedefs[argname] = fixturedefs 2413s # No fixtures defined with this name. 2413s if fixturedefs is None: 2413s raise FixtureLookupError(argname, self) 2413s # The are no fixtures with this name applicable for the function. 2413s if not fixturedefs: 2413s raise FixtureLookupError(argname, self) 2413s # A fixture may override another fixture with the same name, e.g. a 2413s # fixture in a module can override a fixture in a conftest, a fixture in 2413s # a class can override a fixture in the module, and so on. 2413s # An overriding fixture can request its own name (possibly indirectly); 2413s # in this case it gets the value of the fixture it overrides, one level 2413s # up. 2413s # Check how many `argname`s deep we are, and take the next one. 2413s # `fixturedefs` is sorted from furthest to closest, so use negative 2413s # indexing to go in reverse. 2413s index = -1 2413s for request in self._iter_chain(): 2413s if request.fixturename == argname: 2413s index -= 1 2413s # If already consumed all of the available levels, fail. 2413s if -index > len(fixturedefs): 2413s raise FixtureLookupError(argname, self) 2413s fixturedef = fixturedefs[index] 2413s 2413s # Prepare a SubRequest object for calling the fixture. 2413s try: 2413s callspec = self._pyfuncitem.callspec 2413s except AttributeError: 2413s callspec = None 2413s if callspec is not None and argname in callspec.params: 2413s param = callspec.params[argname] 2413s param_index = callspec.indices[argname] 2413s # The parametrize invocation scope overrides the fixture's scope. 2413s scope = callspec._arg2scope[argname] 2413s else: 2413s param = NOTSET 2413s param_index = 0 2413s scope = fixturedef._scope 2413s self._check_fixturedef_without_param(fixturedef) 2413s self._check_scope(fixturedef, scope) 2413s subrequest = SubRequest( 2413s self, scope, param, param_index, fixturedef, _ispytest=True 2413s ) 2413s 2413s # Make sure the fixture value is cached, running it if it isn't 2413s > fixturedef.execute(request=subrequest) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s request = > 2413s 2413s def execute(self, request: SubRequest) -> FixtureValue: 2413s """Return the value of this fixture, executing it if not cached.""" 2413s # Ensure that the dependent fixtures requested by this fixture are loaded. 2413s # This needs to be done before checking if we have a cached value, since 2413s # if a dependent fixture has their cache invalidated, e.g. due to 2413s # parametrization, they finalize themselves and fixtures depending on it 2413s # (which will likely include this fixture) setting `self.cached_result = None`. 2413s # See #4871 2413s requested_fixtures_that_should_finalize_us = [] 2413s for argname in self.argnames: 2413s fixturedef = request._get_active_fixturedef(argname) 2413s # Saves requested fixtures in a list so we later can add our finalizer 2413s # to them, ensuring that if a requested fixture gets torn down we get torn 2413s # down first. 
This is generally handled by SetupState, but still currently 2413s # needed when this fixture is not parametrized but depends on a parametrized 2413s # fixture. 2413s if not isinstance(fixturedef, PseudoFixtureDef): 2413s requested_fixtures_that_should_finalize_us.append(fixturedef) 2413s 2413s # Check for (and return) cached value/exception. 2413s if self.cached_result is not None: 2413s request_cache_key = self.cache_key(request) 2413s cache_key = self.cached_result[1] 2413s try: 2413s # Attempt to make a normal == check: this might fail for objects 2413s # which do not implement the standard comparison (like numpy arrays -- #6497). 2413s cache_hit = bool(request_cache_key == cache_key) 2413s except (ValueError, RuntimeError): 2413s # If the comparison raises, use 'is' as fallback. 2413s cache_hit = request_cache_key is cache_key 2413s 2413s if cache_hit: 2413s if self.cached_result[2] is not None: 2413s exc, exc_tb = self.cached_result[2] 2413s raise exc.with_traceback(exc_tb) 2413s else: 2413s result = self.cached_result[0] 2413s return result 2413s # We have a previous but differently parametrized fixture instance 2413s # so we need to tear it down before creating a new one. 2413s self.finish(request) 2413s assert self.cached_result is None 2413s 2413s # Add finalizer to requested fixtures we saved previously. 2413s # We make sure to do this after checking for cached value to avoid 2413s # adding our finalizer multiple times. (#12135) 2413s finalizer = functools.partial(self.finish, request=request) 2413s for parent_fixture in requested_fixtures_that_should_finalize_us: 2413s parent_fixture.addfinalizer(finalizer) 2413s 2413s ihook = request.node.ihook 2413s try: 2413s # Setup the fixture, run the code in it, and cache the value 2413s # in self.cached_result 2413s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2413s 2413s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2413s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2413s 2413s self = 2413s kwargs = {'fixturedef': , 'request': >} 2413s firstresult = True 2413s 2413s def __call__(self, **kwargs: object) -> Any: 2413s """Call the hook. 2413s 2413s Only accepts keyword arguments, which should match the hook 2413s specification. 2413s 2413s Returns the result(s) of calling all registered plugins, see 2413s :ref:`calling`. 2413s """ 2413s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ___________________ test_options_auto[mysql_pymysql_engine] ____________________ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s test_frame1 = index A B C D 2414s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2414s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_options_auto(conn, request, test_frame1): 2414s # use the set option 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3519: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
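The pytest internals quoted above are the machinery behind request.getfixturevalue(): _get_active_fixturedef() resolves the fixture name (handling overrides and parametrization), and FixtureDef.execute() either returns a cached value, re-raises a cached exception, or sets the fixture up. That is exactly how test_options_auto turns each string in all_connectable into a live fixture. A minimal sketch of the same pattern, with a hypothetical sqlite_engine fixture standing in for the pandas database fixtures:

    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_engine():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    # Parametrize over fixture *names*, then resolve them at run time,
    # the way test_options_auto does with its connection fixtures.
    @pytest.mark.parametrize("conn_name", ["sqlite_engine"])
    def test_roundtrip(conn_name, request):
        conn = request.getfixturevalue(conn_name)
        assert conn.execute("select 1").fetchone() == (1,)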
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
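The pluggy frames above (_HookCaller.__call__ into PluginManager._hookexec) are how pytest dispatches pytest_fixture_setup to whichever plugin answers first: the hook is declared firstresult=True, so the first non-None return value wins. A small, self-contained pluggy example of that mechanism, with a hypothetical project name and hook:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def resolve(self, name):
            """Return a value for *name*, or None to let other plugins try."""

    class Quiet:
        @hookimpl
        def resolve(self, name):
            return None              # declines; the next implementation is tried

    class Upper:
        @hookimpl
        def resolve(self, name):
            return name.upper()

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Quiet())
    pm.register(Upper())

    # Keyword-only call, like ihook.pytest_fixture_setup(fixturedef=..., request=...);
    # with firstresult=True the first non-None implementation result is returned.
    print(pm.hook.resolve(name="pymysql"))   # -> "PYMYSQL"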
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
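versioned_importorskip() and import_optional_dependency(), quoted above, implement pandas' policy for optional test dependencies: import the module, then raise, warn, or return None depending on the errors mode and a minimum-version check. A simplified, hedged re-implementation of the same idea (not pandas' actual code), to make the control flow explicit:

    import importlib
    import warnings

    def import_optional(name, errors="raise", min_version=None):
        # errors: "raise" -> ImportError, "warn" -> warning + None, "ignore" -> None
        assert errors in {"raise", "warn", "ignore"}
        try:
            module = importlib.import_module(name)
        except ImportError:
            if errors == "raise":
                raise ImportError(
                    f"Missing optional dependency '{name}'. "
                    f"Use pip or conda to install {name}."
                ) from None
            return None

        if min_version is not None:
            def parts(v):
                return tuple(int(p) for p in v.split(".") if p.isdigit())
            have = getattr(module, "__version__", "0")
            if parts(have) < parts(min_version):
                msg = f"'{name}' {have} is older than the required {min_version}."
                if errors == "raise":
                    raise ImportError(msg)
                if errors == "warn":
                    warnings.warn(msg)
                return None
        return module

Note that the failure in this log is not a missing module: pymysql is installed, but importing it raises an AttributeError from cryptography, which is not an ImportError and therefore propagates as a test error rather than a skip.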
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d3f50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d3f50> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s ____________________ test_options_auto[mysql_pymysql_conn] _____________________ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s test_frame1 = index A B C D 2414s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2414s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_options_auto(conn, request, test_frame1): 2414s # use the set option 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3519: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 
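The AttributeError above is the root cause of every mysql_pymysql_* failure: the pure-Python layer of python3-cryptography expects its compiled _rust extension to expose an openssl.hashes submodule, and this build does not, so binding Hash = rust_openssl.hashes.Hash fails and anything that imports cryptography (here pymysql._auth) breaks. It points at mismatched cryptography Python and Rust components on the testbed rather than a pandas bug. On a consistent installation, the HashAlgorithm/HashContext API quoted above is used like this:

    from cryptography.hazmat.primitives import hashes

    # Hash is the concrete HashContext; SHA256 is one of the HashAlgorithms
    # listed in __all__ above.
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas autopkgtest")
    print(digest.finalize().hex())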
2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d3fb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5d3fb0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s ________________ test_options_auto[postgresql_psycopg2_engine] _________________ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s test_frame1 = index A B C D 2414s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2414s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_options_auto(conn, request, test_frame1): 2414s # use the set option 2414s conn = request.getfixturevalue(conn) 2414s with pd.option_context("io.sql.engine", "auto"): 2414s > with pandasSQL_builder(conn) as pandasSQL: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3521: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = False 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = False 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _________________ test_options_auto[postgresql_psycopg2_conn] __________________ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s test_frame1 = index A B C D 2414s 0 2000-01-03 00:00:00 0.980269 3.685731 -0.364217 -1...0-01-05 00:00:00 0.498581 0.731168 -0.537677 1.346270 2414s 3 2000-01-06 00:00:00 1.120202 1.567621 0.003641 0.675253 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_options_auto(conn, request, test_frame1): 2414s # use the set option 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3519: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql-mysql_pymysql_engine] _ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb4977e0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3564: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 
2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 
2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 
2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 
2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 
2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c0f50>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c0f50> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 
2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql-mysql_pymysql_conn] _ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb4977e0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3564: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 
2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c1070>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c1070> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb4977e0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = dtype_backend_data 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3567: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s 
/usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 
2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... 
stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 
2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 
2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = 
_raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
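Every psycopg2 failure in this section is the same OperationalError: connection refused on both ::1 and 127.0.0.1 port 5432, i.e. no PostgreSQL server is reachable in the testbed, rather than a defect in the connect() code shown above. The DSN the fixtures build is visible in the frames (host=localhost dbname=pandas user=postgres password=postgres port=5432); a standalone probe of that DSN, assuming only psycopg2 itself:

    # Probe the exact parameters the pandas SQL fixtures use (copied from the log).
    # Reproduces the OperationalError when nothing is listening on localhost:5432.
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="localhost", port=5432,
            dbname="pandas", user="postgres", password="postgres",
        )
    except psycopg2.OperationalError as exc:
        print(f"PostgreSQL unreachable: {exc}")
    else:
        with conn, conn.cursor() as cur:
            cur.execute("SELECT version()")
            print(cur.fetchone()[0])               # server version string
        conn.close()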
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb4977e0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3564: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 
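The surrounding pytest frames show how the parametrized value 'postgresql_psycopg2_conn' is turned into a live fixture via request.getfixturevalue, which is why the connection error surfaces during the test body rather than at collection time. A self-contained sketch of that pattern with illustrative names (not pandas' conftest), using sqlite3 so it needs no server:

    # Tests are parametrized with fixture *names*; getfixturevalue resolves them lazily.
    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn", ["sqlite_conn"])
    def test_roundtrip(conn, request):
        conn = request.getfixturevalue(conn)       # replace the name with the fixture value
        conn.execute("CREATE TABLE t (a INTEGER)")
        assert conn.execute("SELECT count(*) FROM t").fetchone() == (0,)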
2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 
2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 
2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 
2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s 
None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-mysql_pymysql_engine] _ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s string_storage = 'python', func = 'read_sql_query' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb497920> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3564: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 
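[Note on the postgresql_psycopg2 failures above: the OperationalError comes from psycopg2 being unable to reach a PostgreSQL server on localhost:5432; the fixtures build their engine from postgresql+psycopg2://postgres:***@localhost:5432/pandas, and the resolved connection parameters (host=localhost, port=5432, dbname=pandas, user=postgres, password=postgres) are visible in the traceback. A standalone connectivity-check sketch only, with all names and parameters taken from the log rather than from pandas itself:]

    import psycopg2
    from sqlalchemy import create_engine, text

    # Connection parameters copied from the DSN shown in the traceback above.
    DSN = dict(host="localhost", port=5432, dbname="pandas",
               user="postgres", password="postgres")

    try:
        # Same low-level call that fails above; raises OperationalError
        # when nothing is listening on port 5432.
        psycopg2.connect(**DSN).close()
        # Equivalent check through SQLAlchemy, matching the engine URL in the log.
        engine = create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
        )
        with engine.connect() as conn:
            conn.execute(text("select 1"))
        print("PostgreSQL service is reachable")
    except Exception as exc:
        print(f"PostgreSQL not reachable; the SQL tests above fail for this reason: {exc}")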
2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 
2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 
2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 
2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 
2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c1a30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c1a30> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 
2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-mysql_pymysql_conn] _ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql_query' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb497920> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3564: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 
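[Note on the mysql_pymysql failures: these variants never reach a database. Importing pymysql pulls in cryptography's hash primitives, and on this testbed `Hash = rust_openssl.hashes.Hash` raises the AttributeError shown above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'"), so the import error propagates through versioned_importorskip and the tests are reported as failures rather than skips. A minimal reproduction sketch under that assumption; no MySQL server is involved:]

    # The failure happens at import time, so it can be reproduced standalone.
    try:
        import pymysql  # noqa: F401  -- triggers the cryptography hashes import chain above
    except AttributeError as exc:
        # e.g. "module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'"
        print(f"pymysql cannot be imported on this testbed: {exc}")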
2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c1af0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea5c1af0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s string_storage = 'python', func = 'read_sql_query' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb4979c0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = dtype_backend_data 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3567: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s 
/usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 
2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... 
stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 
2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. 
The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 
2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = 
_raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql_query' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb497a60> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype_backend( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3564: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 
2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # There are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`.
2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 
2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 
2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s 
None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-mysql_pymysql_engine] _ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb497ce0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3616: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 
2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # There are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope.
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. 
(#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is too old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ???
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612150>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612150> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-mysql_pymysql_conn] _ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb497ce0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3616: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 
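The AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") is raised at import time inside cryptography itself, before any pandas or pymysql code runs. A standalone check, assuming the same Python environment as the testbed:

try:
    # This is the import that fails above: hashes.py assigns
    # Hash = rust_openssl.hashes.Hash at module import time.
    from cryptography.hazmat.primitives import hashes
except AttributeError as exc:
    print("cryptography Rust bindings out of sync:", exc)
else:
    print("cryptography hashes import OK:", hashes.SHA256().name)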
2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 
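As the getfixturevalue docstring above notes, a fixture can be resolved dynamically by name from a test body instead of being declared as a parameter. A small, self-contained illustration (the db_name fixture is hypothetical, added only for this sketch):

import pytest

@pytest.fixture
def db_name():
    return "pandas"

def test_dynamic_fixture(request):
    # Resolved lazily by name at run time, equivalent to declaring db_name
    # as a test argument; test_sql.py uses the same mechanism to pick its
    # connection fixture from a parametrized string.
    assert request.getfixturevalue("db_name") == "pandas"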
2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 
2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 
2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
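The caching logic above re-runs a fixture whenever its cache key (which reflects the parametrize value) no longer matches, tearing down the previous instance first. A sketch of the behaviour that logic implements, using a hypothetical parametrized fixture:

import pytest

@pytest.fixture(params=["mysql", "postgres"])
def backend(request):
    # Set up once per parameter value; the previous instance is finalized
    # before the next parameter runs, matching the cache-miss path above.
    yield request.param

def test_backend(backend):
    assert backend in {"mysql", "postgres"}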
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612270>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612270> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
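The raw_connection docstring above contrasts the proxied DBAPI connection with the higher-level Connection returned by engine.connect(). A sketch of both access paths, using an in-memory SQLite engine as a stand-in since the PostgreSQL server these tests assume is not reachable on the testbed:

from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")

with engine.connect() as conn:       # high-level Connection, pooled
    conn.execute(text("SELECT 1"))

raw = engine.raw_connection()        # proxied DBAPI connection
raw.cursor().execute("SELECT 1")
raw.close()                          # returns the connection to the pool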
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
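The connection parameters above show what the SQL fixtures expect: a PostgreSQL server on localhost:5432 with user/password 'postgres' and a 'pandas' database. A pre-flight check along those lines, assuming psycopg2 is installed at the path shown in the traceback:

import psycopg2

try:
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="pandas",
        user="postgres",
        password="postgres",
    )
except psycopg2.OperationalError as exc:
    # On this testbed no server is listening, so fixture setup fails here.
    print("PostgreSQL not reachable:", exc)
else:
    conn.close()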
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb497d80> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = dtype_backend_data 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3619: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) 
> num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 
2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... 
conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... 
conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 
2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 
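
[editor's note] A minimal, self-contained sketch of the to_sql behaviour documented above, using an in-memory sqlite3 connection (the DBAPI2 path the docstring mentions) instead of the PostgreSQL engine these tests require; the table name is illustrative only:

    import sqlite3
    import pandas as pd

    # An in-memory SQLite database needs no running server, unlike the
    # postgresql+psycopg2 engine whose connection is refused below.
    conn = sqlite3.connect(":memory:")

    df = pd.DataFrame({"A": [1, None, 2]})

    # to_sql returns the number of rows affected (it should report 3 here);
    # if_exists="replace" drops and recreates the table when it already exists.
    rows = df.to_sql(name="integers", con=conn, index=False, if_exists="replace")
    print(rows)

    # Read the data back through the same DBAPI2 connection.
    print(conn.execute("SELECT * FROM integers").fetchall())
    conn.close()
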
2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if 
should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. 
seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 
2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 
2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 
2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise 
exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9eb497e20> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3616: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 
2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. 
(#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-mysql_pymysql_engine] _ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s string_storage = 'python', func = 'read_sql_table' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9ea6402c0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3616: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 
2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. 
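The pytest frames above show how the string passed through @pytest.mark.parametrize is resolved at setup time: request.getfixturevalue() walks _get_active_fixturedef() and FixtureDef.execute() to build the named fixture on demand. A minimal, self-contained sketch of the same pattern (the fixture names here are hypothetical, not the pandas ones):

    import pytest

    @pytest.fixture
    def sqlite_url():
        return "sqlite:///:memory:"

    @pytest.fixture
    def dummy_url():
        return "dummy://"

    # Parametrize over fixture *names*; each test then resolves the value
    # dynamically, exactly as test_read_sql_dtype_backend_table does with "conn".
    @pytest.mark.parametrize("conn", ["sqlite_url", "dummy_url"])
    def test_resolves_fixture_by_name(conn, request):
        value = request.getfixturevalue(conn)
        assert isinstance(value, str)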
(#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
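versioned_importorskip() above delegates to pandas' internal import_optional_dependency(), whose signature and error modes are quoted in the traceback. A short sketch of how that helper behaves; pandas.compat._optional is a private module, so treat this as illustrative rather than supported usage:

    from pandas.compat._optional import import_optional_dependency

    # errors="ignore" returns None when the module is not installed instead of
    # raising, which lets callers probe for an optional backend.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql not importable; MySQL-backed tests would be skipped")
    else:
        print("pymysql", pymysql.VERSION_STRING)

On this testbed the pymysql import fails with an AttributeError rather than an ImportError (see the cryptography frames below), so even errors="ignore" would not be expected to turn this failure into a clean skip.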
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612c30>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612c30> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 
2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-mysql_pymysql_conn] _ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql_table' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9ea644360> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3616: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 
2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. 
(#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612cf0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea612cf0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
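The SQLAlchemy frames above trace Engine.raw_connection() down through the connection pool (_ConnectionFairy._checkout, _ConnectionRecord.__connect) to the DBAPI connect call that ultimately fails. A minimal sketch of the same call chain against an in-memory SQLite engine, which needs no external server:

    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")   # pooled engine, no server required
    raw = engine.raw_connection()         # proxied DBAPI connection from the pool
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchone())             # (1,)
    finally:
        raw.close()                       # returns the connection to the pool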
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s string_storage = 'python', func = 'read_sql_table' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9ea644400> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = dtype_backend_data 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3619: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if 
len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 
2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... 
conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... 
conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 
2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 
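The frames above quote pandas.io.sql.to_sql and pandasSQL_builder, which route a plain sqlite3.Connection to the SQLite fallback and a SQLAlchemy connectable to SQLDatabase. A minimal sketch of the same round trip the failing test performs, but against an in-memory SQLite connection so it runs without the PostgreSQL server that is refusing connections here (table and column names are illustrative, not taken from the log):

import sqlite3
import pandas as pd

df = pd.DataFrame({"a": [1, 2, None]})
con = sqlite3.connect(":memory:")          # DBAPI connection -> SQLite fallback path
df.to_sql(name="test", con=con, index=False, if_exists="replace")
out = pd.read_sql("SELECT * FROM test", con, dtype_backend="numpy_nullable")
print(out.dtypes)                          # nullable extension dtype instead of the default NumPy dtype
con.close()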
2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if 
should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. 
seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 
2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 
2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 
2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise 
exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s string_storage = 'python', func = 'read_sql_table' 2414s dtype_backend = 'numpy_nullable' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s dtype_backend_expected = .func at 0x79e9ea6444a0> 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table"]) 2414s def test_read_sql_dtype_backend_table( 2414s conn, 2414s request, 2414s string_storage, 2414s func, 2414s dtype_backend, 2414s dtype_backend_data, 2414s dtype_backend_expected, 2414s ): 2414s if "sqlite" in conn and "adbc" not in conn: 2414s request.applymarker( 2414s pytest.mark.xfail( 2414s reason=( 2414s "SQLite actually returns proper boolean values via " 2414s "read_sql_table, but before pytest refactor was skipped" 2414s ) 2414s ) 2414s ) 2414s # GH#50048 2414s conn_name = conn 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3616: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 
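The failing test resolves its database fixture by name through request.getfixturevalue, as the docstring above describes. A self-contained sketch of that pattern (fixture and table names are illustrative, using sqlite3 so no external server is needed):

import sqlite3
import pytest

@pytest.fixture
def sqlite_conn():
    conn = sqlite3.connect(":memory:")
    yield conn
    conn.close()

@pytest.mark.parametrize("conn", ["sqlite_conn"])
def test_roundtrip(conn, request):
    # the parametrized value is a fixture *name*; resolve it at runtime
    conn = request.getfixturevalue(conn)
    conn.execute("CREATE TABLE t (a INTEGER)")
    conn.execute("INSERT INTO t VALUES (1)")
    assert conn.execute("SELECT a FROM t").fetchone() == (1,)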
2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. 
(#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ___ test_read_sql_invalid_dtype_backend_table[read_sql-mysql_pymysql_engine] ___ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s func = 'read_sql' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673350>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673350> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s ____ test_read_sql_invalid_dtype_backend_table[read_sql-mysql_pymysql_conn] ____ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s func = 'read_sql' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 
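Note on the AttributeError above: the mysql_pymysql_* parametrizations fail earlier, at import time. Importing pymysql pulls in pymysql._auth, which imports cryptography.hazmat.primitives.hashes, and that module fails with AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'. This pattern usually indicates a cryptography installation whose Python layer and compiled _rust bindings are out of step (plausible for this i386-on-amd64 testbed, though the log excerpt does not show the installed versions). Because the failure is an AttributeError rather than an ImportError, the try around import_optional_dependency in versioned_importorskip presumably does not turn it into a skip (the except clause is not visible in this excerpt), so the tests error instead of being skipped. A minimal reproduction sketch, assuming it is run inside the same testbed:

    import cryptography
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    print("cryptography", cryptography.__version__)
    # On a healthy install this is True; in this run it is False, so
    # cryptography.hazmat.primitives.hashes (and anything importing it, e.g. pymysql) fails.
    print("rust_openssl.hashes present:", hasattr(rust_openssl, "hashes"))

    # AttributeError is not a subclass of ImportError, so a skip helper that only
    # catches ImportError lets the error escape to the test report:
    try:
        import pymysql  # noqa: F401
    except ImportError:
        print("would be reported as a skip")
    except AttributeError as err:
        print("surfaces as a test error instead:", err)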
2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673410>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673410> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
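# Editorial sketch, not part of the captured log output.
# The AttributeError recorded just above ("module
# 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'")
# usually means the pure-Python layer of the cryptography package and its
# compiled _rust extension come from mismatched builds.  Because the failure
# is an AttributeError rather than an ImportError, it propagates out of the
# pymysql import, so the mysql_pymysql_engine fixture errors instead of being
# skipped.  The helper below is a hypothetical, minimal consistency check;
# the function name is illustrative only.
import cryptography


def cryptography_bindings_ok() -> bool:
    """Return True if the hashes primitives import cleanly from the Rust bindings."""
    try:
        # Same import chain that failed in the traceback above.
        from cryptography.hazmat.primitives import hashes  # noqa: F401
    except Exception as exc:  # AttributeError on mismatched python/_rust builds
        print(f"cryptography {cryptography.__version__}: unusable bindings: {exc!r}")
        return False
    return True


if __name__ == "__main__":
    print("cryptography bindings usable:", cryptography_bindings_ok())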
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s func = 'read_sql' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = dtype_backend_data 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3648: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c 
None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). 
If None, use default schema (default). 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s func = 'read_sql' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
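[editor's note] The pluggy frames above (HookCaller.__call__ and PluginManager._hookexec) are how pytest dispatches its pytest_fixture_setup hook: the call accepts keyword arguments only and, because the hookspec is marked firstresult=True, the first non-None implementation result is returned. A standalone sketch of that calling convention; the project and hook names here are invented for illustration and are not pytest's:

    # Minimal pluggy example of a keyword-only, firstresult hook call.
    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_thing(self, name):
            """Return a value for *name*; the first non-None result wins."""

    class Plugin:
        @hookimpl
        def setup_thing(self, name):
            return f"value for {name}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    print(pm.hook.setup_thing(name="x"))  # firstresult: a single value, not a list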
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
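[editor's note] The frames above trace the path the pandas fixture takes: postgresql_psycopg2_conn calls Engine.connect(), which goes through Engine.raw_connection() and the connection pool, and the raw psycopg2 error is wrapped into sqlalchemy.exc.OperationalError by _handle_dbapi_exception_noconnection. A minimal sketch of the same engine usage outside the suite; the URL is reconstructed from the Engine repr and the dsn frame in the log, and the snippet is illustrative rather than the fixture's actual code:

    # Rebuild the engine the pandas fixture uses and try one round trip.
    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as conn:  # same call the fixture makes
            print(conn.execute(text("select 1")).scalar())
    except Exception as exc:
        # On this testbed this raises sqlalchemy.exc.OperationalError,
        # wrapping the psycopg2 "Connection refused" shown above.
        print(type(exc).__name__, exc)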
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_table-mysql_pymysql_engine] _ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s func = 'read_sql_table' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
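[editor's note] mysql_pymysql_engine calls pandas' versioned_importorskip, which wraps the import_optional_dependency helper whose docstring appears above: a missing module raises ImportError (and the wrapper turns that into a skip), while errors="ignore" returns None. In this run the import of pymysql itself dies with an AttributeError deeper in cryptography, which is neither a missing module nor an old version, so the fixtures error instead of skipping. A hedged sketch of the documented behaviour, using only the signature shown above:

    # Illustrative use of the helper whose docstring appears above.
    from pandas.compat._optional import import_optional_dependency

    try:
        # errors="ignore" returns None when pymysql is simply not installed;
        # it is not documented to mask other import-time failures.
        mod = import_optional_dependency("pymysql", errors="ignore")
        print("pymysql available:", mod is not None)
    except AttributeError as exc:
        # What happens on this testbed: the import dies inside cryptography.
        print("pymysql import itself failed:", exc)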
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673ad0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673ad0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_table-mysql_pymysql_conn] _ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s func = 'read_sql_table' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 
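[editor's note] The AttributeError above ("module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes'") suggests the installed cryptography package's Python code and its compiled _rust bindings are out of step on this image; pymysql only trips over it because its _auth module imports cryptography's hashes and serialization primitives at import time. A short probe, independent of pandas and pymysql, that isolates the mismatch:

    # Isolate the cryptography breakage seen above, without pymysql in the way.
    try:
        from cryptography.hazmat.primitives import hashes
        print("cryptography OK, e.g.", hashes.SHA256().name)
    except AttributeError as exc:
        print("cryptography/_rust bindings mismatch:", exc)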
2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673b90>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673b90> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s func = 'read_sql_table' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = dtype_backend_data 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3648: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True 
c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). 
If None, use default schema (default). 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
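The psycopg2.connect docstring quoted in this frame accepts either a libpq DSN string or keyword arguments; the keywords are folded into a DSN with make_dsn(), which is how the dsn = 'host=localhost dbname=pandas ...' value above is produced. A small sketch of that equivalence, assuming psycopg2 is importable; no connection is attempted here, and key order in the generated string may differ:

from psycopg2.extensions import make_dsn

# Keyword arguments are merged into a libpq connection string.
dsn = make_dsn(host="localhost", dbname="pandas", user="postgres",
               password="postgres", port=5432)
print(dsn)  # e.g. 'host=localhost dbname=pandas user=postgres password=postgres port=5432'

# The two call styles are interchangeable (both would be refused on this
# testbed while no PostgreSQL server is listening on port 5432):
#   psycopg2.connect(dsn)
#   psycopg2.connect(host="localhost", dbname="pandas", user="postgres",
#                    password="postgres", port=5432)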
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s func = 'read_sql_table' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
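_handle_dbapi_exception_noconnection(), shown above, wraps the driver-level psycopg2.OperationalError in sqlalchemy.exc.OperationalError and re-raises it "from" the original, which is what produces the "The above exception was the direct cause of the following exception" blocks and the sqlalche.me/e/20/e3q8 link in this log. A small sketch of inspecting that wrapping, under the same assumptions as the earlier snippet (SQLAlchemy 2.x, psycopg2, nothing listening on port 5432):

import psycopg2
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
try:
    engine.connect()
except OperationalError as err:
    print(isinstance(err.orig, psycopg2.OperationalError))  # True: driver error kept on .orig
    print(err.__cause__ is err.orig)                        # True: re-raised "from" the original
    print(err.code)                                         # "e3q8", the suffix of the sqlalche.me link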
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_query-mysql_pymysql_engine] _ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s func = 'read_sql_query' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
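versioned_importorskip() and import_optional_dependency(), quoted above, implement the usual optional-dependency pattern: import the module by name and skip the test (or raise) when it is missing or older than a minimum version. A generic sketch of that pattern using plain pytest rather than pandas' helper; the module name and minimum version are illustrative:

import pytest

def test_needs_pymysql():
    # Skip rather than fail when the optional driver is absent or too old;
    # importorskip compares the module's __version__ against minversion.
    pymysql = pytest.importorskip("pymysql", minversion="1.0.2")
    assert hasattr(pymysql, "connect")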
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673ef0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673ef0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_query-mysql_pymysql_conn] _ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s func = 'read_sql_query' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 
2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673ef0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea673ef0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x406348f0, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s func = 'read_sql_query' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = dtype_backend_data 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3648: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True 
c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). 
If None, use default schema (default). 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
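The frames above show DataFrame.to_sql handing off to pandas.io.sql.to_sql, and the pandasSQL_builder helper quoted here picking the backend from the type of `con`: a plain sqlite3.Connection takes the DBAPI fallback (SQLiteDatabase), while a URI string or SQLAlchemy connectable goes through SQLDatabase. A minimal sketch of the fallback path, assuming only pandas and the standard-library sqlite3 module (no SQLAlchemy and no running database server, unlike the failing fixtures below):

    import sqlite3
    import pandas as pd

    df = pd.DataFrame({"name": ["User 1", "User 2"]})

    # A DBAPI2 sqlite3 connection is handled by the SQLiteDatabase fallback;
    # an Engine or a "postgresql+psycopg2://..." URI would go through SQLDatabase.
    with sqlite3.connect(":memory:") as con:
        written = df.to_sql(name="users", con=con, index=False)
        print(written)                                         # 2 rows reported
        print(con.execute("SELECT name FROM users").fetchall())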
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s func = 'read_sql_query' 2414s dtype_backend_data = a b c d e f g h 2414s 0 1 1 1.5 1.5 True True a a 2414s 1 2 2.0 False False b b 2414s 2 3 3 2.5 2.5 None True c None 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_table", "read_sql_query"]) 2414s def test_read_sql_invalid_dtype_backend_table(conn, request, func, dtype_backend_data): 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3645: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ______________ test_chunksize_empty_dtypes[mysql_pymysql_engine] _______________ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_chunksize_empty_dtypes(conn, request): 2414s # GH#50245 2414s if "adbc" in conn: 2414s request.node.add_marker( 2414s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2414s ) 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3737: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bc3b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bc3b0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
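install_as_MySQLdb, whose docstring appears just above (its one-line body follows in the next frame), lets code written against mysqlclient run on pymysql by aliasing the module in sys.modules. A minimal usage sketch, assuming a working pymysql install; on this testbed the import of pymysql itself fails, as the rest of the traceback shows:

    import pymysql

    pymysql.install_as_MySQLdb()

    # After the call, "import MySQLdb" resolves to the pymysql module itself.
    import MySQLdb
    assert MySQLdb is pymysql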
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _______________ test_chunksize_empty_dtypes[mysql_pymysql_conn] ________________ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_chunksize_empty_dtypes(conn, request): 2414s # GH#50245 2414s if "adbc" in conn: 2414s request.node.add_marker( 2414s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2414s ) 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3737: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 
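The AttributeError above (module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes') is the actual reason pymysql cannot be imported here: the pure-Python cryptography layer and its compiled _rust bindings appear to be out of step on this testbed. A minimal sketch of how one might isolate that mismatch outside pytest, using only the imports already visible in the traceback:

    # Reproduce the failing import chain shown above, one step at a time.
    try:
        from cryptography.hazmat.bindings._rust import openssl as rust_openssl
        print("rust bindings expose 'hashes':", hasattr(rust_openssl, "hashes"))
        from cryptography.hazmat.primitives import hashes  # fails on this testbed
        print("hashes module OK:", hashes.SHA256().name)
    except Exception as exc:
        print("cryptography is internally inconsistent:", exc)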
2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bc4d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bc4d0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s ___________ test_chunksize_empty_dtypes[postgresql_psycopg2_engine] ____________ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
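Engine.raw_connection, whose docstring is being quoted in this frame, hands back a pooled DBAPI connection directly instead of a SQLAlchemy Connection object. A minimal sketch, assuming only SQLAlchemy and an in-memory SQLite database so that no external server is required:

    from sqlalchemy import create_engine

    engine = create_engine("sqlite:///:memory:")
    raw = engine.raw_connection()   # proxied DBAPI connection from the pool
    cur = raw.cursor()
    cur.execute("SELECT 1")
    print(cur.fetchone())           # (1,)
    raw.close()                     # returned to the pool, not actually closed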
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_chunksize_empty_dtypes(conn, request): 2414s # GH#50245 2414s if "adbc" in conn: 2414s request.node.add_marker( 2414s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2414s ) 2414s conn = request.getfixturevalue(conn) 2414s dtypes = {"a": "int64", "b": "object"} 2414s df = DataFrame(columns=["a", "b"]).astype(dtypes) 2414s expected = df.copy() 2414s > df.to_sql(name="test", con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3741: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = (Empty DataFrame 2414s Columns: [a, b] 2414s Index: [],) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Empty DataFrame 2414s Columns: [a, b] 2414s Index: [], name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize 
= None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = Empty DataFrame 2414s Columns: [a, b] 2414s Index: [], name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 
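The failing call (test_sql.py:3741) reduces to writing an empty, typed frame with ``df.to_sql``. A minimal sketch of the same call against an in-memory SQLite engine, so it runs without the PostgreSQL server that is unreachable in this log; the frame and dtypes mirror the test, the engine URL is illustrative:

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")  # in-memory DB, no server needed
    df = pd.DataFrame(columns=["a", "b"]).astype({"a": "int64", "b": "object"})
    # Same arguments as the failing test: replace the table, do not write the index.
    rows = df.to_sql(name="test", con=engine, index=False, if_exists="replace")
    print(rows)  # expected to be 0 for an empty frame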
2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ____________ test_chunksize_empty_dtypes[postgresql_psycopg2_conn] _____________ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
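The root cause in both failures is the psycopg2 ``OperationalError`` above: nothing is listening on localhost:5432. A small probe of the same DSN shown in the traceback, assuming only that psycopg2 is installed, shows the failure without SQLAlchemy in the way:

    import psycopg2

    dsn = "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    try:
        psycopg2.connect(dsn).close()
        print("PostgreSQL reachable")
    except psycopg2.OperationalError as exc:
        # On this testbed this prints the same "Connection refused" message as above.
        print(f"PostgreSQL not reachable: {exc}")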
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s def test_chunksize_empty_dtypes(conn, request): 2414s # GH#50245 2414s if "adbc" in conn: 2414s request.node.add_marker( 2414s pytest.mark.xfail(reason="chunksize argument NotImplemented with ADBC") 2414s ) 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3737: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 
2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 
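The pandas SQL tests parametrize over fixture *names* and resolve them at run time with ``request.getfixturevalue``, which is why the connection error surfaces during fixture setup rather than in the test body. A stripped-down sketch of that pattern; ``fake_conn`` and ``test_lookup_by_name`` are illustrative names, not part of the pandas suite:

    import pytest

    @pytest.fixture
    def fake_conn():
        yield "connection-stand-in"  # stands in for an engine/connection fixture

    @pytest.mark.parametrize("conn", ["fake_conn"])
    def test_lookup_by_name(conn, request):
        # If the named fixture raises (as postgresql_psycopg2_conn does above),
        # the error is reported against this line.
        conn = request.getfixturevalue(conn)
        assert conn == "connection-stand-in"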
2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ___ test_read_sql_dtype[read_sql-_NoDefault.no_default-mysql_pymysql_engine] ___ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s func = 'read_sql', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
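[Note, not part of the captured log] The OperationalError recorded above comes from the postgresql_psycopg2_engine fixture: it builds a SQLAlchemy engine for the DSN echoed in the traceback (host=localhost dbname=pandas user=postgres password=postgres port=5432), and the connection is refused because no PostgreSQL server is listening on the testbed. A minimal sketch of the same connection attempt, assuming SQLAlchemy 2.x and psycopg2 are installed and using that same DSN, is:

    # Sketch of the failure mode above; the URL mirrors the one shown in the traceback.
    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect():
            pass
    except OperationalError as err:
        # With nothing listening on port 5432, psycopg2 reports "Connection refused"
        # and SQLAlchemy wraps it as OperationalError (error code e3q8, as above).
        print(err)

Every test that requests a postgresql_* fixture in this run fails the same way during fixture setup, before any pandas code is exercised.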
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bcef0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bcef0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s ____ test_read_sql_dtype[read_sql-_NoDefault.no_default-mysql_pymysql_conn] ____ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s func = 'read_sql', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 
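[Note, not part of the captured log] The AttributeError above means the mysql_pymysql_* fixtures never reach MySQL at all: importing pymysql pulls in cryptography.hazmat.primitives.hashes, which expects the Rust extension module to expose an openssl.hashes attribute, and the cryptography build installed on this testbed does not provide it. A minimal check, assuming the same cryptography package layout as on the testbed, is:

    # Probe the attribute that hashes.py dereferences at import time.
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl

    # On a consistent cryptography install this prints True; on this testbed it is
    # False, so "Hash = rust_openssl.hashes.Hash" raises the AttributeError above.
    print(hasattr(rust_openssl, "hashes"))

The same import failure repeats for every pymysql-backed fixture in the remainder of this run.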
2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bd010>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bd010> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
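The AttributeError above is raised while binding Hash to the Rust-backed OpenSSL bindings, which suggests the installed cryptography Python layer and its _rust extension do not match. For reference, a minimal sketch of the usual hashes API that this module exposes once it imports cleanly (standard cryptography usage, not specific to this test run):

    from cryptography.hazmat.primitives import hashes

    # Hash is a concrete HashContext: update() feeds data, finalize() returns
    # the digest bytes for the chosen HashAlgorithm.
    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas test data")
    print(digest.finalize().hex())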
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s func = 'read_sql', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = DataFrame({"a": [1, 2, 3], "b": 5}) 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3760: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = 
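The psycopg2 docstring above lists the basic connection parameters. A minimal sketch of the connection the test fixture attempts, using the parameters shown in the dsn from the traceback (it fails in this environment simply because no PostgreSQL server is listening on localhost:5432):

    import psycopg2

    # Parameters taken from the dsn shown in the traceback above.
    conn = psycopg2.connect(
        host="localhost",
        dbname="pandas",
        user="postgres",
        password="postgres",
        port=5432,
    )  # raises psycopg2.OperationalError ("Connection refused") without a server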
None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 
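The to_sql docstring quoted above already walks through an in-memory SQLite example. Condensed into a minimal sketch that mirrors the call the failing test makes, but against SQLite so it runs without an external database server:

    import pandas as pd
    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://", echo=False)
    df = pd.DataFrame({"a": [1, 2, 3], "b": 5})
    # Same call pattern as test_read_sql_dtype, but on an in-memory database.
    df.to_sql(name="test", con=engine, index=False, if_exists="replace")
    with engine.connect() as conn:
        print(conn.execute(text("SELECT * FROM test")).fetchall())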
2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
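The Engine.connect docstring quoted above shows the context-manager pattern that pandas relies on here. A minimal sketch of that usage, again against an in-memory SQLite engine so no server is required:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")
    # The Connection acts as a context manager: when the block exits, the
    # underlying DBAPI connection is returned to the pool and any open
    # transaction is rolled back unless commit() was called.
    with engine.connect() as connection:
        connection.execute(text("CREATE TABLE t (x INTEGER)"))
        connection.execute(text("INSERT INTO t VALUES (1)"))
        connection.commit()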
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s func = 'read_sql', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
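The pluggy hook call shown above accepts keyword arguments only and, for a firstresult hook such as pytest_fixture_setup, returns the first non-None value a plugin produces. A minimal sketch of that mechanism, with invented plugin names, assuming only the public pluggy API:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def resolve(self, name): ...

    class Defer:
        @hookimpl
        def resolve(self, name):
            return None  # None defers to the next plugin

    class Upper:
        @hookimpl
        def resolve(self, name):
            return name.upper()

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Defer())
    pm.register(Upper())
    # Keyword arguments only; the first non-None result is returned.
    assert pm.hook.resolve(name="pg") == "PG"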
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
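Putting the Engine.connect() and raw_connection() docstrings quoted above side by side, a brief sketch of both access paths against a pooled engine; the sqlite URL is a stand-in so the snippet runs anywhere, whereas the engine in this log points at postgresql+psycopg2://postgres:***@localhost:5432/pandas.

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")  # in-memory stand-in

    # Context-manager form: the connection is returned to the pool on exit.
    with engine.connect() as conn:
        assert conn.execute(text("select 1")).scalar() == 1

    # "Raw" DBAPI connection from the pool; close() checks it back in
    # rather than closing it for real.
    raw = engine.raw_connection()
    raw.close()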
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ______ test_read_sql_dtype[read_sql-numpy_nullable-mysql_pymysql_engine] _______ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
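One way a suite could avoid repeating this failure for every parametrized case when no PostgreSQL server is listening is to probe connectivity once and skip; this is only a hedged sketch of that pattern with an invented fixture name, not how the pandas conftest is actually written.

    import pytest
    from sqlalchemy import create_engine
    from sqlalchemy.exc import OperationalError

    @pytest.fixture(scope="session")
    def pg_engine():
        # Same connection details as the fixtures in this log.
        engine = create_engine(
            "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
        )
        try:
            with engine.connect():
                pass
        except OperationalError as exc:
            pytest.skip(f"PostgreSQL not available: {exc}")
        yield engine
        engine.dispose()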
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
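The pandas helper shown above wraps the optional-import pattern described in its docstring; as a rough, simplified illustration (not the pandas implementation), the core behaviour could be sketched as follows, with the function name and messages invented for the example.

    import importlib
    import warnings

    def optional_import(name, errors="raise"):
        # errors: "raise" -> ImportError, "warn"/"ignore" -> return None
        try:
            return importlib.import_module(name)
        except ImportError:
            if errors == "raise":
                raise ImportError(f"Missing optional dependency '{name}'.")
            if errors == "warn":
                warnings.warn(f"Optional dependency '{name}' not found.")
            return None

    pymysql = optional_import("pymysql", errors="ignore")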
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bd9d0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bd9d0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _______ test_read_sql_dtype[read_sql-numpy_nullable-mysql_pymysql_conn] ________ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 
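The AttributeError above is raised while cryptography's pure-Python hashes module looks for a 'hashes' attribute on its compiled _rust.openssl bindings; a mismatch between the package's Python sources and its compiled extension is one plausible explanation, though this log does not confirm the cause. A small diagnostic sketch using only the public API, which fails at the same import in an affected environment:

    import cryptography
    print(cryptography.__version__)

    # Importing hashes triggers the same module-level lookup seen above.
    from cryptography.hazmat.primitives import hashes

    digest = hashes.Hash(hashes.SHA256())
    digest.update(b"pandas")
    print(digest.finalize().hex())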
2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
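For orientation, the two helpers quoted above boil down to "import an optional module and enforce a minimum version". A rough, hypothetical sketch of that idea follows; the function name and the simplified numeric version comparison are illustrative only, not pandas internals:

import importlib
from importlib.metadata import PackageNotFoundError, version

def import_optional(name, min_version=None):
    # Import the optional module, or report None when it is missing.
    try:
        module = importlib.import_module(name)
    except ImportError:
        return None
    if min_version is not None:
        # Simplified numeric comparison; real code would use a proper
        # version parser and a distribution-to-module name mapping.
        try:
            installed = version(name)
        except PackageNotFoundError:
            installed = getattr(module, "__version__", "0")

        def as_key(v):
            return tuple(int(p) for p in v.split(".") if p.isdigit())

        if as_key(installed) < as_key(min_version):
            return None
    return module

pandas' actual import_optional_dependency additionally supports errors='raise'/'warn'/'ignore' and an install-name mapping, as its docstring above describes.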
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bdaf0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bdaf0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s ___ test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_engine] ____ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
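Regarding the AttributeError that closes the traceback above (the cryptography Rust bindings missing the 'hashes' submodule): a standalone, purely diagnostic check, not part of the test suite, would be:

# On a healthy cryptography install this prints True; on the mismatched
# build seen above it prints False, and importing
# cryptography.hazmat.primitives.hashes raises the same AttributeError.
from cryptography.hazmat.bindings._rust import openssl as rust_openssl
print(hasattr(rust_openssl, "hashes"))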
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = DataFrame({"a": [1, 2, 3], "b": 5}) 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3760: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = 
None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 
2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
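As context for the dispatch that pandasSQL_builder's docstring describes above, the same to_sql/read_sql calls also work against a plain sqlite3 connection, which needs no external database server. A minimal sketch, assuming only pandas and the standard library:

import sqlite3
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": 5})

# DBAPI path: for a raw sqlite3.Connection pandas uses its SQLite
# fallback, so no SQLAlchemy engine (or running server) is required.
with sqlite3.connect(":memory:") as con:
    df.to_sql("test", con, index=False, if_exists="replace")
    print(pd.read_sql("SELECT * FROM test", con))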
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
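Finally, on the recurring "Connection refused" failures: the psycopg2 connect() docstring above shows the parameters these fixtures use (localhost, port 5432, database 'pandas'). A tiny, hypothetical pre-check that something is actually listening there could gate such fixtures instead of letting every parametrized test error out:

import socket

def postgres_listening(host="localhost", port=5432, timeout=1.0):
    # True only if a server accepts TCP connections on host:port; this
    # mirrors the "Is the server running ..." hint in the error above.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(postgres_listening())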
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ____ test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_conn] _____ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s func = 'read_sql', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
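The pluggy frames above show how pytest dispatches pytest_fixture_setup: hooks accept keyword arguments only and, for firstresult hooks, a single value is returned. A small self-contained pluggy sketch of that calling convention; the project name and hook are invented for illustration:

    import pluggy

    hookspec = pluggy.HookspecMarker("demo")
    hookimpl = pluggy.HookimplMarker("demo")

    class Spec:
        @hookspec(firstresult=True)
        def setup_value(self, request): ...

    class Plugin:
        @hookimpl
        def setup_value(self, request):
            return f"value for {request}"

    pm = pluggy.PluginManager("demo")
    pm.add_hookspecs(Spec)
    pm.register(Plugin())
    # Keyword-only call; firstresult=True yields one result instead of a list.
    print(pm.hook.setup_value(request="r1"))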
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
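Engine.raw_connection(), whose docstring is quoted above, hands back a pool-proxied DBAPI connection whose close() returns it to the pool rather than closing it. A hedged usage sketch, reusing the DSN from the traceback and assuming a reachable server:

    from sqlalchemy import create_engine

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    raw = engine.raw_connection()   # pool-proxied DBAPI connection
    try:
        cur = raw.cursor()
        cur.execute("SELECT 1")
        print(cur.fetchone())
    finally:
        raw.close()  # returns the connection to the pool instead of closing it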
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype[read_sql_query-_NoDefault.no_default-mysql_pymysql_engine] _ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
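FixtureDef.execute(), quoted above, caches one value per parametrization and finalizes the previous value before computing the next one. A hedged pytest sketch of that per-parameter lifecycle, with illustrative names:

    import pytest

    @pytest.fixture(params=["postgres", "mysql"])
    def backend(request):
        value = f"connection to {request.param}"
        yield value                      # cached for every test using this parameter
        print(f"tearing down {value}")   # runs before the next parameter is set up

    def test_backend(backend):
        assert backend.startswith("connection to")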
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
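import_optional_dependency, whose docstring appears above, is the pandas-internal helper behind versioned_importorskip; the import path below matches the traceback but should be treated as internal API. A hedged sketch of the errors= behaviour it documents:

    from pandas.compat._optional import import_optional_dependency

    # errors="raise" (the default) raises ImportError when the module is
    # missing or too old; errors="ignore" returns None when it is missing.
    pymysql = import_optional_dependency("pymysql", errors="ignore")
    if pymysql is None:
        print("pymysql not available; the dependent tests would be skipped")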
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0be3f0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0be3f0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype[read_sql_query-_NoDefault.no_default-mysql_pymysql_conn] _ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 
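The test body quoted above resolves its connection fixture dynamically with request.getfixturevalue, because the fixture name arrives as a string parameter. A minimal, self-contained sketch of that pattern (the sqlite fixture here is illustrative, not one of pandas' fixtures):

    import sqlite3
    import pytest

    @pytest.fixture
    def sqlite_conn():
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn_name", ["sqlite_conn"])
    def test_dynamic_fixture(conn_name, request):
        conn = request.getfixturevalue(conn_name)  # look up the fixture by name
        assert conn.execute("SELECT 1").fetchone() == (1,)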
2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
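The pytest internals above check FixtureDef.cached_result so that a fixture body runs once per scope and later requests either reuse the cached value or re-raise the cached exception. A small illustrative sketch of that caching behaviour, independent of the pandas suite:

    import pytest

    calls = []

    @pytest.fixture(scope="module")
    def engine_stub():
        calls.append("setup")      # runs once for the module, then is cached
        return object()

    def test_first(engine_stub):
        assert calls == ["setup"]

    def test_second(engine_stub):
        assert calls == ["setup"]  # cached value is reused; no second setup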
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
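The mysql_pymysql_engine fixture above goes through td.versioned_importorskip, which builds on import_optional_dependency (quoted next). An importorskip-style helper typically converts only ImportError into a skip, which is why the AttributeError raised while importing pymysql surfaces here as a test error rather than a skip. A minimal sketch of the pattern, assuming plain importlib and pytest (this is not pandas' implementation):

    import importlib
    import pytest

    def importorskip_sketch(name: str):
        # Skip only when the module is absent; any other import-time failure
        # (such as the AttributeError above) still propagates to the test.
        try:
            return importlib.import_module(name)
        except ImportError:
            pytest.skip(f"optional dependency {name!r} is not installed")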
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0be4b0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0be4b0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
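The psycopg2 connect call quoted here fails a few frames below with OperationalError: Connection refused on localhost:5432, i.e. no PostgreSQL server is reachable from the test bed, so the postgresql-backed parametrisations error during fixture setup. A hedged sketch of a pre-flight check that would let such tests skip instead (host and port taken from the DSN in the traceback; the helper name is illustrative):

    import socket
    import pytest

    def postgres_available(host: str = "localhost", port: int = 5432) -> bool:
        # Cheap TCP probe: True only if something is listening on the port.
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not postgres_available(), reason="no PostgreSQL server on localhost:5432"
    )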
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = DataFrame({"a": [1, 2, 3], "b": 5}) 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3760: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype 
= None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
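test_read_sql_dtype writes a small frame with to_sql and, per its parametrisation, reads it back with read_sql/read_sql_query and a dtype_backend; on this test bed both database backends fail before the round trip starts. A self-contained sketch of the same kind of round trip against an in-memory SQLite engine, which needs no server (illustrative, not the test itself):

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("sqlite://")          # in-memory SQLite
    df = pd.DataFrame({"a": [1, 2, 3], "b": 5})
    df.to_sql(name="test", con=engine, index=False, if_exists="replace")

    result = pd.read_sql("SELECT * FROM test", con=engine,
                         dtype_backend="numpy_nullable")
    # Integer columns come back as nullable Int64 with this backend.
    print(result.dtypes)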
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 
2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
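SQLDatabase.__init__ shown earlier in this traceback registers cleanup on a contextlib.ExitStack: an Engine created from a URL string gets a dispose() callback, and a Connection opened from an Engine is entered as a context manager on the same stack. A standalone sketch of that pattern, with an illustrative in-memory URL and assuming SQLAlchemy is importable:

    from contextlib import ExitStack
    from sqlalchemy import create_engine, text

    stack = ExitStack()
    engine = create_engine("sqlite:///:memory:")    # illustrative URL
    stack.callback(engine.dispose)                  # dispose the Engine when the stack closes
    conn = stack.enter_context(engine.connect())    # Connection is closed when the stack closes
    conn.execute(text("SELECT 1"))
    stack.close()                                   # closes the Connection first, then disposes the Engine

Cleanup runs in LIFO order, so the Connection opened last is closed before the Engine it came from is disposed.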
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
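The cparams shown above are the connection parameters these fixtures use (host=localhost, port=5432, dbname=pandas, user=postgres). When nothing is listening on that port, psycopg2.connect raises OperationalError, which is what the failures in this log report. A minimal sketch using the same parameters, not something the test suite itself runs:

    import psycopg2

    try:
        conn = psycopg2.connect(host="localhost", port=5432, dbname="pandas",
                                user="postgres", password="postgres")
    except psycopg2.OperationalError as exc:
        # "Connection refused" here means no PostgreSQL server is reachable on localhost:5432.
        print(f"could not connect: {exc}")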
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_conn] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
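The failure above is the sqlalchemy.exc.OperationalError wrapper around psycopg2's "Connection refused": the engine URL is well formed, but no PostgreSQL server is running on localhost:5432 in this testbed. A minimal connectivity probe against the same DSN the fixtures build, offered as a sketch rather than anything the test suite itself does:

    from sqlalchemy import create_engine, text
    from sqlalchemy.exc import OperationalError

    engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/pandas")
    try:
        with engine.connect() as probe:
            probe.execute(text("SELECT 1"))
        print("PostgreSQL reachable")
    except OperationalError as exc:
        print(f"PostgreSQL unreachable: {exc}")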
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
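The OperationalError in these failures reports two attempts, one to ::1 and one to 127.0.0.1, because "localhost" resolves to both an IPv6 and an IPv4 address and both connections are refused. A quick, illustrative way to check the port on both stacks (not part of the test suite):

    import socket

    for host in ("::1", "127.0.0.1"):
        family = socket.AF_INET6 if ":" in host else socket.AF_INET
        with socket.socket(family, socket.SOCK_STREAM) as s:
            s.settimeout(1)
            state = "open" if s.connect_ex((host, 5432)) == 0 else "closed"
        print(f"{host}:5432 is {state}")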
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
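request.getfixturevalue, whose docstring appears above, is how the pandas test resolves a fixture whose name only arrives as a parametrize value. A minimal sketch of the same pattern with hypothetical fixture names:

    import pytest

    @pytest.fixture
    def sqlite_conn():
        import sqlite3
        conn = sqlite3.connect(":memory:")
        yield conn
        conn.close()

    @pytest.mark.parametrize("conn_name", ["sqlite_conn"])
    def test_roundtrip(conn_name, request):
        # resolved by name inside the test body, as in test_read_sql_dtype above
        conn = request.getfixturevalue(conn_name)
        assert conn.execute("SELECT 1").fetchone() == (1,)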
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s ___ test_read_sql_dtype[read_sql_query-numpy_nullable-mysql_pymysql_engine] ____ 2414s conn = 'mysql_pymysql_engine' 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bedb0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0bedb0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s ____ test_read_sql_dtype[read_sql_query-numpy_nullable-mysql_pymysql_conn] _____ 2414s conn = 'mysql_pymysql_conn' 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 
2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 
2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s > fixturedef = request._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1047: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'mysql_pymysql_engine' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 
2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 
2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s @pytest.fixture 2414s def mysql_pymysql_engine(): 2414s sqlalchemy = td.versioned_importorskip("sqlalchemy") 2414s > pymysql = td.versioned_importorskip("pymysql") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:605: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ('pymysql',), kwargs = {} 2414s 2414s def versioned_importorskip(*args, **kwargs): 2414s """ 2414s (warning - this is currently Debian-specific, the name may change if upstream request this) 2414s 2414s Return the requested module, or skip the test if it is 2414s not available in a new enough version. 2414s 2414s Intended as a replacement for pytest.importorskip that 2414s defaults to requiring at least pandas' minimum version for that 2414s optional dependency, rather than any version. 2414s 2414s See import_optional_dependency for full parameter documentation. 2414s """ 2414s try: 2414s > module = import_optional_dependency(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_test_decorators.py:189: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', extra = '', errors = 'raise', min_version = None 2414s 2414s def import_optional_dependency( 2414s name: str, 2414s extra: str = "", 2414s errors: str = "raise", 2414s min_version: str | None = None, 2414s ): 2414s """ 2414s Import an optional dependency. 
2414s 2414s By default, if a dependency is missing an ImportError with a nice 2414s message will be raised. If a dependency is present, but too old, 2414s we raise. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s The module name. 2414s extra : str 2414s Additional text to include in the ImportError message. 2414s errors : str {'raise', 'warn', 'ignore'} 2414s What to do when a dependency is not found or its version is too old. 2414s 2414s * raise : Raise an ImportError 2414s * warn : Only applicable when a module's version is to old. 2414s Warns that the version is too old and returns None 2414s * ignore: If the module is not installed, return None, otherwise, 2414s return the module, even if the version is too old. 2414s It's expected that users validate the version locally when 2414s using ``errors="ignore"`` (see. ``io/html.py``) 2414s min_version : str, default None 2414s Specify a minimum version that is different from the global pandas 2414s minimum version required. 2414s Returns 2414s ------- 2414s maybe_module : Optional[ModuleType] 2414s The imported module, when found and the version is correct. 2414s None is returned when the package is not found and `errors` 2414s is False, or when the package's version is too old and `errors` 2414s is ``'warn'`` or ``'ignore'``. 2414s """ 2414s assert errors in {"warn", "raise", "ignore"} 2414s if name=='numba' and warn_numba_platform: 2414s warnings.warn(warn_numba_platform) 2414s 2414s package_name = INSTALL_MAPPING.get(name) 2414s install_name = package_name if package_name is not None else name 2414s 2414s msg = ( 2414s f"Missing optional dependency '{install_name}'. {extra} " 2414s f"Use pip or conda to install {install_name}." 2414s ) 2414s try: 2414s > module = importlib.import_module(name) 2414s 2414s /usr/lib/python3/dist-packages/pandas/compat/_optional.py:140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None 2414s 2414s def import_module(name, package=None): 2414s """Import a module. 2414s 2414s The 'package' argument is required when performing a relative import. It 2414s specifies the package to use as the anchor point from which to resolve the 2414s relative import to an absolute import. 2414s 2414s """ 2414s level = 0 2414s if name.startswith('.'): 2414s if not package: 2414s raise TypeError("the 'package' argument is required to perform a " 2414s f"relative import for {name!r}") 2414s for character in name: 2414s if character != '.': 2414s break 2414s level += 1 2414s > return _bootstrap._gcd_import(name[level:], package, level) 2414s 2414s /usr/lib/python3.13/importlib/__init__.py:88: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', package = None, level = 0 2414s 2414s > ??? 2414s 2414s :1387: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 2414s 2414s :1360: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s name = 'pymysql', import_ = 2414s 2414s > ??? 
2414s 2414s :1331: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s spec = ModuleSpec(name='pymysql', loader=<_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0beed0>, origin='/usr/lib/python3/dist-packages/pymysql/__init__.py', submodule_search_locations=['/usr/lib/python3/dist-packages/pymysql']) 2414s 2414s > ??? 2414s 2414s :935: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_frozen_importlib_external.SourceFileLoader object at 0x79e9ea0beed0> 2414s module = 2414s 2414s > ??? 2414s 2414s :1022: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s f = 2414s args = ( at 0x3fed6c80, file "/usr/lib/python3/dist-packages/pymysql/__init__.py", line 1>, {'DataError'...rr.DatabaseError'>, 'Date': , 'DateFromTicks': , ...}) 2414s kwds = {} 2414s 2414s > ??? 2414s 2414s :488: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s PyMySQL: A pure-Python MySQL client library. 2414s 2414s Copyright (c) 2010-2016 PyMySQL contributors 2414s 2414s Permission is hereby granted, free of charge, to any person obtaining a copy 2414s of this software and associated documentation files (the "Software"), to deal 2414s in the Software without restriction, including without limitation the rights 2414s to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 2414s copies of the Software, and to permit persons to whom the Software is 2414s furnished to do so, subject to the following conditions: 2414s 2414s The above copyright notice and this permission notice shall be included in 2414s all copies or substantial portions of the Software. 2414s 2414s THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 2414s IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 2414s FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 2414s AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 2414s LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 2414s OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 2414s THE SOFTWARE. 2414s """ 2414s 2414s import sys 2414s 2414s from .constants import FIELD_TYPE 2414s from .err import ( 2414s Warning, 2414s Error, 2414s InterfaceError, 2414s DataError, 2414s DatabaseError, 2414s OperationalError, 2414s IntegrityError, 2414s InternalError, 2414s NotSupportedError, 2414s ProgrammingError, 2414s MySQLError, 2414s ) 2414s from .times import ( 2414s Date, 2414s Time, 2414s Timestamp, 2414s DateFromTicks, 2414s TimeFromTicks, 2414s TimestampFromTicks, 2414s ) 2414s 2414s # PyMySQL version. 2414s # Used by setuptools and connection_attrs 2414s VERSION = (1, 1, 1, "final", 1) 2414s VERSION_STRING = "1.1.1" 2414s 2414s ### for mysqlclient compatibility 2414s ### Django checks mysqlclient version. 2414s version_info = (1, 4, 6, "final", 1) 2414s __version__ = "1.4.6" 2414s 2414s 2414s def get_client_info(): # for MySQLdb compatibility 2414s return __version__ 2414s 2414s 2414s def install_as_MySQLdb(): 2414s """ 2414s After this function is called, any application that imports MySQLdb 2414s will unwittingly actually use pymysql. 
2414s """ 2414s sys.modules["MySQLdb"] = sys.modules["pymysql"] 2414s 2414s 2414s # end of mysqlclient compatibility code 2414s 2414s threadsafety = 1 2414s apilevel = "2.0" 2414s paramstyle = "pyformat" 2414s 2414s > from . import connections # noqa: E402 2414s 2414s /usr/lib/python3/dist-packages/pymysql/__init__.py:79: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # Python implementation of the MySQL client-server protocol 2414s # http://dev.mysql.com/doc/internals/en/client-server-protocol.html 2414s # Error codes: 2414s # https://dev.mysql.com/doc/refman/5.5/en/error-handling.html 2414s import errno 2414s import os 2414s import socket 2414s import struct 2414s import sys 2414s import traceback 2414s import warnings 2414s 2414s > from . import _auth 2414s 2414s /usr/lib/python3/dist-packages/pymysql/connections.py:13: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s """ 2414s Implements auth methods 2414s """ 2414s 2414s from .err import OperationalError 2414s 2414s 2414s try: 2414s from cryptography.hazmat.backends import default_backend 2414s > from cryptography.hazmat.primitives import serialization, hashes 2414s 2414s /usr/lib/python3/dist-packages/pymysql/_auth.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s > from cryptography.hazmat.primitives._serialization import ( 2414s BestAvailableEncryption, 2414s Encoding, 2414s KeySerializationEncryption, 2414s NoEncryption, 2414s ParameterFormat, 2414s PrivateFormat, 2414s PublicFormat, 2414s _KeySerializationEncryption, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/serialization/__init__.py:7: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography import utils 2414s > from cryptography.hazmat.primitives.hashes import HashAlgorithm 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/_serialization.py:10: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s # This file is dual licensed under the terms of the Apache License, Version 2414s # 2.0, and the BSD License. See the LICENSE file in the root of this repository 2414s # for complete details. 
2414s 2414s from __future__ import annotations 2414s 2414s import abc 2414s 2414s from cryptography.hazmat.bindings._rust import openssl as rust_openssl 2414s 2414s __all__ = [ 2414s "HashAlgorithm", 2414s "HashContext", 2414s "Hash", 2414s "ExtendableOutputFunction", 2414s "SHA1", 2414s "SHA512_224", 2414s "SHA512_256", 2414s "SHA224", 2414s "SHA256", 2414s "SHA384", 2414s "SHA512", 2414s "SHA3_224", 2414s "SHA3_256", 2414s "SHA3_384", 2414s "SHA3_512", 2414s "SHAKE128", 2414s "SHAKE256", 2414s "MD5", 2414s "BLAKE2b", 2414s "BLAKE2s", 2414s "SM3", 2414s ] 2414s 2414s 2414s class HashAlgorithm(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def name(self) -> str: 2414s """ 2414s A string naming this algorithm (e.g. "sha256", "md5"). 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def digest_size(self) -> int: 2414s """ 2414s The size of the resulting digest in bytes. 2414s """ 2414s 2414s @property 2414s @abc.abstractmethod 2414s def block_size(self) -> int | None: 2414s """ 2414s The internal block size of the hash function, or None if the hash 2414s function does not use blocks internally (e.g. SHA3). 2414s """ 2414s 2414s 2414s class HashContext(metaclass=abc.ABCMeta): 2414s @property 2414s @abc.abstractmethod 2414s def algorithm(self) -> HashAlgorithm: 2414s """ 2414s A HashAlgorithm that will be used by this context. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def update(self, data: bytes) -> None: 2414s """ 2414s Processes the provided bytes through the hash. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def finalize(self) -> bytes: 2414s """ 2414s Finalizes the hash context and returns the hash digest as bytes. 2414s """ 2414s 2414s @abc.abstractmethod 2414s def copy(self) -> HashContext: 2414s """ 2414s Return a HashContext that is a copy of the current context. 2414s """ 2414s 2414s 2414s > Hash = rust_openssl.hashes.Hash 2414s E AttributeError: module 'cryptography.hazmat.bindings._rust.openssl' has no attribute 'hashes' 2414s 2414s /usr/lib/python3/dist-packages/cryptography/hazmat/primitives/hashes.py:87: AttributeError 2414s _ test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_engine] _ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 
2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s 
try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. 
The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s conn = request.getfixturevalue(conn) 2414s table = "test" 2414s df = DataFrame({"a": [1, 2, 3], "b": 5}) 2414s > df.to_sql(name=table, con=conn, index=False, if_exists="replace") 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3760: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s args = ( a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5,) 2414s kwargs = {'con': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas), 'if_exists': 'replace', 'index': False, 'name': 'test'} 2414s 2414s @wraps(func) 2414s def wrapper(*args, **kwargs): 2414s if len(args) > num_allow_args: 2414s warnings.warn( 2414s msg.format(arguments=_format_argument_list(allow_args)), 2414s FutureWarning, 2414s stacklevel=find_stack_level(), 2414s ) 2414s > return func(*args, **kwargs) 2414s 2414s /usr/lib/python3/dist-packages/pandas/util/_decorators.py:333: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s 
chunksize = None, dtype = None, method = None 2414s 2414s @final 2414s @deprecate_nonkeyword_arguments( 2414s version="3.0", allowed_args=["self", "name", "con"], name="to_sql" 2414s ) 2414s def to_sql( 2414s self, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool_t = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Databases supported by SQLAlchemy [1]_ are supported. Tables can be 2414s newly created, appended to, or overwritten. 2414s 2414s Parameters 2414s ---------- 2414s name : str 2414s Name of SQL table. 2414s con : sqlalchemy.engine.(Engine or Connection) or sqlite3.Connection 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. Legacy support is provided for sqlite3.Connection objects. The user 2414s is responsible for engine disposal and connection closure for the SQLAlchemy 2414s connectable. See `here \ 2414s `_. 2414s If passing a sqlalchemy.engine.Connection which is already in a transaction, 2414s the transaction will not be committed. If passing a sqlite3.Connection, 2414s it will not be possible to roll back the record insertion. 2414s 2414s schema : str, optional 2414s Specify the schema (if database flavor supports this). If None, use 2414s default schema. 2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s How to behave if the table already exists. 2414s 2414s * fail: Raise a ValueError. 2414s * replace: Drop the table before inserting new values. 2414s * append: Insert new values to the existing table. 2414s 2414s index : bool, default True 2414s Write DataFrame index as a column. Uses `index_label` as the column 2414s name in the table. Creates a table index for this column. 2414s index_label : str or sequence, default None 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 legacy mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s * None : Uses standard SQL ``INSERT`` clause (one per row). 2414s * 'multi': Pass multiple values in a single ``INSERT`` clause. 2414s * callable with signature ``(pd_table, conn, keys, data_iter)``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 
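The ``to_sql`` docstring quoted above already sketches the intended usage. As a minimal, self-contained round trip of the same call that fails in the test (not taken from the test suite, and using an in-memory SQLite engine so that no database server has to be running):

    import pandas as pd
    from sqlalchemy import create_engine

    # In-memory SQLite engine: no server process needed, unlike the
    # postgresql_psycopg2_engine fixture whose connection is refused above.
    engine = create_engine("sqlite://")

    df = pd.DataFrame({"a": [1, 2, 3], "b": 5})
    rows = df.to_sql(name="test", con=engine, index=False, if_exists="replace")
    print(rows)  # 3 -> number of rows written
    print(pd.read_sql_query("SELECT * FROM test", engine))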
2414s 2414s The number of returned rows affected is the sum of the ``rowcount`` 2414s attribute of ``sqlite3.Cursor`` or SQLAlchemy connectable which may not 2414s reflect the exact number of written rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Raises 2414s ------ 2414s ValueError 2414s When the table already exists and `if_exists` is 'fail' (the 2414s default). 2414s 2414s See Also 2414s -------- 2414s read_sql : Read a DataFrame from a table. 2414s 2414s Notes 2414s ----- 2414s Timezone aware datetime columns will be written as 2414s ``Timestamp with timezone`` type with SQLAlchemy if supported by the 2414s database. Otherwise, the datetimes will be stored as timezone unaware 2414s timestamps local to the original timezone. 2414s 2414s Not all datastores support ``method="multi"``. Oracle, for example, 2414s does not support multi-value insert. 2414s 2414s References 2414s ---------- 2414s .. [1] https://docs.sqlalchemy.org 2414s .. [2] https://www.python.org/dev/peps/pep-0249/ 2414s 2414s Examples 2414s -------- 2414s Create an in-memory SQLite database. 2414s 2414s >>> from sqlalchemy import create_engine 2414s >>> engine = create_engine('sqlite://', echo=False) 2414s 2414s Create a table from scratch with 3 rows. 2414s 2414s >>> df = pd.DataFrame({'name' : ['User 1', 'User 2', 'User 3']}) 2414s >>> df 2414s name 2414s 0 User 1 2414s 1 User 2 2414s 2 User 3 2414s 2414s >>> df.to_sql(name='users', con=engine) 2414s 3 2414s >>> from sqlalchemy import text 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3')] 2414s 2414s An `sqlalchemy.engine.Connection` can also be passed to `con`: 2414s 2414s >>> with engine.begin() as connection: 2414s ... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']}) 2414s ... df1.to_sql(name='users', con=connection, if_exists='append') 2414s 2 2414s 2414s This is allowed to support operations that require that the same 2414s DBAPI connection is used for the entire operation. 2414s 2414s >>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']}) 2414s >>> df2.to_sql(name='users', con=engine, if_exists='append') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 1'), (1, 'User 2'), (2, 'User 3'), 2414s (0, 'User 4'), (1, 'User 5'), (0, 'User 6'), 2414s (1, 'User 7')] 2414s 2414s Overwrite the table with just ``df2``. 2414s 2414s >>> df2.to_sql(name='users', con=engine, if_exists='replace', 2414s ... index_label='id') 2414s 2 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM users")).fetchall() 2414s [(0, 'User 6'), (1, 'User 7')] 2414s 2414s Use ``method`` to define a callable insertion method to do nothing 2414s if there's a primary key conflict on a table in a PostgreSQL database. 2414s 2414s >>> from sqlalchemy.dialects.postgresql import insert 2414s >>> def insert_on_conflict_nothing(table, conn, keys, data_iter): 2414s ... # "a" is the primary key in "conflict_table" 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"]) 2414s ... result = conn.execute(stmt) 2414s ... 
return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP 2414s 0 2414s 2414s For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict 2414s on a primary key. 2414s 2414s >>> from sqlalchemy.dialects.mysql import insert 2414s >>> def insert_on_conflict_update(table, conn, keys, data_iter): 2414s ... # update columns "b" and "c" on primary key conflict 2414s ... data = [dict(zip(keys, row)) for row in data_iter] 2414s ... stmt = ( 2414s ... insert(table.table) 2414s ... .values(data) 2414s ... ) 2414s ... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c) 2414s ... result = conn.execute(stmt) 2414s ... return result.rowcount 2414s >>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP 2414s 2 2414s 2414s Specify the dtype (especially useful for integers with missing values). 2414s Notice that while pandas is forced to store the data as floating point, 2414s the database supports nullable integers. When fetching the data with 2414s Python, we get back integer scalars. 2414s 2414s >>> df = pd.DataFrame({"A": [1, None, 2]}) 2414s >>> df 2414s A 2414s 0 1.0 2414s 1 NaN 2414s 2 2.0 2414s 2414s >>> from sqlalchemy.types import Integer 2414s >>> df.to_sql(name='integers', con=engine, index=False, 2414s ... dtype={"A": Integer()}) 2414s 3 2414s 2414s >>> with engine.connect() as conn: 2414s ... conn.execute(text("SELECT * FROM integers")).fetchall() 2414s [(1,), (None,), (2,)] 2414s """ # noqa: E501 2414s from pandas.io import sql 2414s 2414s > return sql.to_sql( 2414s self, 2414s name, 2414s con, 2414s schema=schema, 2414s if_exists=if_exists, 2414s index=index, 2414s index_label=index_label, 2414s chunksize=chunksize, 2414s dtype=dtype, 2414s method=method, 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/pandas/core/generic.py:3087: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s frame = a b 2414s 0 1 5 2414s 1 2 5 2414s 2 3 5, name = 'test' 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, if_exists = 'replace', index = False, index_label = None 2414s chunksize = None, dtype = None, method = None, engine = 'auto' 2414s engine_kwargs = {} 2414s 2414s def to_sql( 2414s frame, 2414s name: str, 2414s con, 2414s schema: str | None = None, 2414s if_exists: Literal["fail", "replace", "append"] = "fail", 2414s index: bool = True, 2414s index_label: IndexLabel | None = None, 2414s chunksize: int | None = None, 2414s dtype: DtypeArg | None = None, 2414s method: Literal["multi"] | Callable | None = None, 2414s engine: str = "auto", 2414s **engine_kwargs, 2414s ) -> int | None: 2414s """ 2414s Write records stored in a DataFrame to a SQL database. 2414s 2414s Parameters 2414s ---------- 2414s frame : DataFrame, Series 2414s name : str 2414s Name of SQL table. 2414s con : ADBC Connection, SQLAlchemy connectable, str, or sqlite3 connection 2414s or sqlite3 DBAPI2 connection 2414s ADBC provides high performance I/O with native type support, where available. 2414s Using SQLAlchemy makes it possible to use any DB supported by that 2414s library. 2414s If a DBAPI2 object, only sqlite3 is supported. 2414s schema : str, optional 2414s Name of SQL schema in database to write to (if database flavor 2414s supports this). If None, use default schema (default). 
2414s if_exists : {'fail', 'replace', 'append'}, default 'fail' 2414s - fail: If table exists, do nothing. 2414s - replace: If table exists, drop it, recreate it, and insert data. 2414s - append: If table exists, insert data. Create if does not exist. 2414s index : bool, default True 2414s Write DataFrame index as a column. 2414s index_label : str or sequence, optional 2414s Column label for index column(s). If None is given (default) and 2414s `index` is True, then the index names are used. 2414s A sequence should be given if the DataFrame uses MultiIndex. 2414s chunksize : int, optional 2414s Specify the number of rows in each batch to be written at a time. 2414s By default, all rows will be written at once. 2414s dtype : dict or scalar, optional 2414s Specifying the datatype for columns. If a dictionary is used, the 2414s keys should be the column names and the values should be the 2414s SQLAlchemy types or strings for the sqlite3 fallback mode. If a 2414s scalar is provided, it will be applied to all columns. 2414s method : {None, 'multi', callable}, optional 2414s Controls the SQL insertion clause used: 2414s 2414s - None : Uses standard SQL ``INSERT`` clause (one per row). 2414s - ``'multi'``: Pass multiple values in a single ``INSERT`` clause. 2414s - callable with signature ``(pd_table, conn, keys, data_iter) -> int | None``. 2414s 2414s Details and a sample callable implementation can be found in the 2414s section :ref:`insert method `. 2414s engine : {'auto', 'sqlalchemy'}, default 'auto' 2414s SQL engine library to use. If 'auto', then the option 2414s ``io.sql.engine`` is used. The default ``io.sql.engine`` 2414s behavior is 'sqlalchemy' 2414s 2414s .. versionadded:: 1.3.0 2414s 2414s **engine_kwargs 2414s Any additional kwargs are passed to the engine. 2414s 2414s Returns 2414s ------- 2414s None or int 2414s Number of rows affected by to_sql. None is returned if the callable 2414s passed into ``method`` does not return an integer number of rows. 2414s 2414s .. versionadded:: 1.4.0 2414s 2414s Notes 2414s ----- 2414s The returned rows affected is the sum of the ``rowcount`` attribute of ``sqlite3.Cursor`` 2414s or SQLAlchemy connectable. If using ADBC the returned rows are the result 2414s of ``Cursor.adbc_ingest``. The returned value may not reflect the exact number of written 2414s rows as stipulated in the 2414s `sqlite3 `__ or 2414s `SQLAlchemy `__ 2414s """ # noqa: E501 2414s if if_exists not in ("fail", "replace", "append"): 2414s raise ValueError(f"'{if_exists}' is not valid for if_exists") 2414s 2414s if isinstance(frame, Series): 2414s frame = frame.to_frame() 2414s elif not isinstance(frame, DataFrame): 2414s raise NotImplementedError( 2414s "'frame' argument should be either a Series or a DataFrame" 2414s ) 2414s 2414s > with pandasSQL_builder(con, schema=schema, need_transaction=True) as pandas_sql: 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:841: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def pandasSQL_builder( 2414s con, 2414s schema: str | None = None, 2414s need_transaction: bool = False, 2414s ) -> PandasSQL: 2414s """ 2414s Convenience function to return the correct PandasSQL subclass based on the 2414s provided parameters. Also creates a sqlalchemy connection and transaction 2414s if necessary. 
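``pandasSQL_builder`` (whose docstring ends here) picks a wrapper from the type of ``con``: a ``sqlite3.Connection`` (or ``None``) goes to the SQLite fallback path, while a URI string or SQLAlchemy connectable goes through ``SQLDatabase``. A rough caller-side sketch of that dispatch, assuming only pandas plus the standard library for the first call and SQLAlchemy additionally installed for the second:

    import sqlite3
    import pandas as pd

    df = pd.DataFrame({"x": [1, 2]})

    # sqlite3 DBAPI connection -> handled by the SQLite fallback path
    with sqlite3.connect(":memory:") as con:
        df.to_sql("t", con, index=False)

    # URI string -> pandas builds a SQLAlchemy engine internally
    df.to_sql("t", "sqlite:///:memory:", index=False, if_exists="replace")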
2414s """ 2414s import sqlite3 2414s 2414s if isinstance(con, sqlite3.Connection) or con is None: 2414s return SQLiteDatabase(con) 2414s 2414s sqlalchemy = import_optional_dependency("sqlalchemy", errors="ignore") 2414s 2414s if isinstance(con, str) and sqlalchemy is None: 2414s raise ImportError("Using URI string without sqlalchemy installed.") 2414s 2414s if sqlalchemy is not None and isinstance(con, (str, sqlalchemy.engine.Connectable)): 2414s > return SQLDatabase(con, schema, need_transaction) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:906: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s con = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s schema = None, need_transaction = True 2414s 2414s def __init__( 2414s self, con, schema: str | None = None, need_transaction: bool = False 2414s ) -> None: 2414s from sqlalchemy import create_engine 2414s from sqlalchemy.engine import Engine 2414s from sqlalchemy.schema import MetaData 2414s 2414s # self.exit_stack cleans up the Engine and Connection and commits the 2414s # transaction if any of those objects was created below. 2414s # Cleanup happens either in self.__exit__ or at the end of the iterator 2414s # returned by read_sql when chunksize is not None. 2414s self.exit_stack = ExitStack() 2414s if isinstance(con, str): 2414s con = create_engine(con) 2414s self.exit_stack.callback(con.dispose) 2414s if isinstance(con, Engine): 2414s > con = self.exit_stack.enter_context(con.connect()) 2414s 2414s /usr/lib/python3/dist-packages/pandas/io/sql.py:1636: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
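The pool frames above show why the failure surfaces here rather than when the fixture builds the engine: ``create_engine()`` is lazy, and the pool only opens a real DBAPI connection on first checkout. A minimal sketch of that behaviour, assuming psycopg2 is installed and nothing is listening on port 5432 (the URL mirrors the one used by the fixture):

    from sqlalchemy import create_engine, text

    # Building the engine succeeds even with no server running...
    engine = create_engine(
        "postgresql+psycopg2://postgres:postgres@localhost:5432/pandas"
    )

    try:
        # ...the OperationalError only appears when a pooled connection
        # is actually checked out.
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
    except Exception as exc:
        print(type(exc).__name__, exc)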
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
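The ``psycopg2.connect`` docstring quoted above accepts either a DSN string or keyword arguments, which are merged into the same DSN via ``make_dsn``. A sketch of the two equivalent calls the dialect ends up making; both will only succeed once a PostgreSQL server is accepting connections on localhost:5432:

    import psycopg2

    # DSN string form (matches the dsn value shown in the traceback)
    conn = psycopg2.connect(
        "host=localhost dbname=pandas user=postgres password=postgres port=5432"
    )

    # Equivalent keyword form; psycopg2 merges these into the same DSN
    conn = psycopg2.connect(
        host="localhost", dbname="pandas",
        user="postgres", password="postgres", port=5432,
    )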
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _ test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_conn] __ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s conn = 'postgresql_psycopg2_conn' 2414s request = > 2414s func = 'read_sql_query', dtype_backend = 'numpy_nullable' 2414s 2414s @pytest.mark.parametrize("conn", all_connectable) 2414s @pytest.mark.parametrize("dtype_backend", [lib.no_default, "numpy_nullable"]) 2414s @pytest.mark.parametrize("func", ["read_sql", "read_sql_query"]) 2414s def test_read_sql_dtype(conn, request, func, dtype_backend): 2414s # GH#50797 2414s > conn = request.getfixturevalue(conn) 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3757: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def getfixturevalue(self, argname: str) -> Any: 2414s """Dynamically run a named fixture function. 2414s 2414s Declaring fixtures via function argument is recommended where possible. 2414s But if you can only decide whether to use another fixture at test 2414s setup time, you may use this function to retrieve it inside a fixture 2414s or test function body. 2414s 2414s This method can be used during the test setup phase or the test run 2414s phase, but during the test teardown phase a fixture's value may not 2414s be available. 2414s 2414s :param argname: 2414s The fixture name. 2414s :raises pytest.FixtureLookupError: 2414s If the given fixture could not be found. 2414s """ 2414s # Note that in addition to the use case described in the docstring, 2414s # getfixturevalue() is also called by pytest itself during item and fixture 2414s # setup to evaluate the fixtures that are requested statically 2414s # (using function parameters, autouse, etc). 2414s 2414s > fixturedef = self._get_active_fixturedef(argname) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:532: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = > 2414s argname = 'postgresql_psycopg2_conn' 2414s 2414s def _get_active_fixturedef( 2414s self, argname: str 2414s ) -> FixtureDef[object] | PseudoFixtureDef[object]: 2414s if argname == "request": 2414s cached_result = (self, [0], None) 2414s return PseudoFixtureDef(cached_result, Scope.Function) 2414s 2414s # If we already finished computing a fixture by this name in this item, 2414s # return it. 2414s fixturedef = self._fixture_defs.get(argname) 2414s if fixturedef is not None: 2414s self._check_scope(fixturedef, fixturedef._scope) 2414s return fixturedef 2414s 2414s # Find the appropriate fixturedef. 
2414s fixturedefs = self._arg2fixturedefs.get(argname, None) 2414s if fixturedefs is None: 2414s # We arrive here because of a dynamic call to 2414s # getfixturevalue(argname) which was naturally 2414s # not known at parsing/collection time. 2414s fixturedefs = self._fixturemanager.getfixturedefs(argname, self._pyfuncitem) 2414s if fixturedefs is not None: 2414s self._arg2fixturedefs[argname] = fixturedefs 2414s # No fixtures defined with this name. 2414s if fixturedefs is None: 2414s raise FixtureLookupError(argname, self) 2414s # The are no fixtures with this name applicable for the function. 2414s if not fixturedefs: 2414s raise FixtureLookupError(argname, self) 2414s # A fixture may override another fixture with the same name, e.g. a 2414s # fixture in a module can override a fixture in a conftest, a fixture in 2414s # a class can override a fixture in the module, and so on. 2414s # An overriding fixture can request its own name (possibly indirectly); 2414s # in this case it gets the value of the fixture it overrides, one level 2414s # up. 2414s # Check how many `argname`s deep we are, and take the next one. 2414s # `fixturedefs` is sorted from furthest to closest, so use negative 2414s # indexing to go in reverse. 2414s index = -1 2414s for request in self._iter_chain(): 2414s if request.fixturename == argname: 2414s index -= 1 2414s # If already consumed all of the available levels, fail. 2414s if -index > len(fixturedefs): 2414s raise FixtureLookupError(argname, self) 2414s fixturedef = fixturedefs[index] 2414s 2414s # Prepare a SubRequest object for calling the fixture. 2414s try: 2414s callspec = self._pyfuncitem.callspec 2414s except AttributeError: 2414s callspec = None 2414s if callspec is not None and argname in callspec.params: 2414s param = callspec.params[argname] 2414s param_index = callspec.indices[argname] 2414s # The parametrize invocation scope overrides the fixture's scope. 2414s scope = callspec._arg2scope[argname] 2414s else: 2414s param = NOTSET 2414s param_index = 0 2414s scope = fixturedef._scope 2414s self._check_fixturedef_without_param(fixturedef) 2414s self._check_scope(fixturedef, scope) 2414s subrequest = SubRequest( 2414s self, scope, param, param_index, fixturedef, _ispytest=True 2414s ) 2414s 2414s # Make sure the fixture value is cached, running it if it isn't 2414s > fixturedef.execute(request=subrequest) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:617: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s request = > 2414s 2414s def execute(self, request: SubRequest) -> FixtureValue: 2414s """Return the value of this fixture, executing it if not cached.""" 2414s # Ensure that the dependent fixtures requested by this fixture are loaded. 2414s # This needs to be done before checking if we have a cached value, since 2414s # if a dependent fixture has their cache invalidated, e.g. due to 2414s # parametrization, they finalize themselves and fixtures depending on it 2414s # (which will likely include this fixture) setting `self.cached_result = None`. 2414s # See #4871 2414s requested_fixtures_that_should_finalize_us = [] 2414s for argname in self.argnames: 2414s fixturedef = request._get_active_fixturedef(argname) 2414s # Saves requested fixtures in a list so we later can add our finalizer 2414s # to them, ensuring that if a requested fixture gets torn down we get torn 2414s # down first. 
This is generally handled by SetupState, but still currently 2414s # needed when this fixture is not parametrized but depends on a parametrized 2414s # fixture. 2414s if not isinstance(fixturedef, PseudoFixtureDef): 2414s requested_fixtures_that_should_finalize_us.append(fixturedef) 2414s 2414s # Check for (and return) cached value/exception. 2414s if self.cached_result is not None: 2414s request_cache_key = self.cache_key(request) 2414s cache_key = self.cached_result[1] 2414s try: 2414s # Attempt to make a normal == check: this might fail for objects 2414s # which do not implement the standard comparison (like numpy arrays -- #6497). 2414s cache_hit = bool(request_cache_key == cache_key) 2414s except (ValueError, RuntimeError): 2414s # If the comparison raises, use 'is' as fallback. 2414s cache_hit = request_cache_key is cache_key 2414s 2414s if cache_hit: 2414s if self.cached_result[2] is not None: 2414s exc, exc_tb = self.cached_result[2] 2414s raise exc.with_traceback(exc_tb) 2414s else: 2414s result = self.cached_result[0] 2414s return result 2414s # We have a previous but differently parametrized fixture instance 2414s # so we need to tear it down before creating a new one. 2414s self.finish(request) 2414s assert self.cached_result is None 2414s 2414s # Add finalizer to requested fixtures we saved previously. 2414s # We make sure to do this after checking for cached value to avoid 2414s # adding our finalizer multiple times. (#12135) 2414s finalizer = functools.partial(self.finish, request=request) 2414s for parent_fixture in requested_fixtures_that_should_finalize_us: 2414s parent_fixture.addfinalizer(finalizer) 2414s 2414s ihook = request.node.ihook 2414s try: 2414s # Setup the fixture, run the code in it, and cache the value 2414s # in self.cached_result 2414s > result = ihook.pytest_fixture_setup(fixturedef=self, request=request) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1091: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def __call__(self, **kwargs: object) -> Any: 2414s """Call the hook. 2414s 2414s Only accepts keyword arguments, which should match the hook 2414s specification. 2414s 2414s Returns the result(s) of calling all registered plugins, see 2414s :ref:`calling`. 2414s """ 2414s assert ( 2414s not self.is_historic() 2414s ), "Cannot directly call a historic hook - use call_historic instead." 2414s self._verify_all_args_are_provided(kwargs) 2414s firstresult = self.spec.opts.get("firstresult", False) if self.spec else False 2414s # Copy because plugins may register other plugins during iteration (#438). 2414s > return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_hooks.py:513: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = <_pytest.config.PytestPluginManager object at 0x79e9f6ccc050> 2414s hook_name = 'pytest_fixture_setup' 2414s methods = [>] 2414s kwargs = {'fixturedef': , 'request': >} 2414s firstresult = True 2414s 2414s def _hookexec( 2414s self, 2414s hook_name: str, 2414s methods: Sequence[HookImpl], 2414s kwargs: Mapping[str, object], 2414s firstresult: bool, 2414s ) -> object | list[object]: 2414s # called from all hookcaller instances. 
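The pytest internals above show how the test obtains its connection: it is parametrized with fixture *names* and resolves them with ``request.getfixturevalue`` at run time, which is why the missing database turns into an error during fixture setup rather than at collection. A small, self-contained illustration of that pattern, with hypothetical fixture and test names unrelated to the pandas suite:

    import pytest

    @pytest.fixture
    def sqlite_uri():
        return "sqlite:///:memory:"

    @pytest.mark.parametrize("conn", ["sqlite_uri"])
    def test_dynamic_fixture(conn, request):
        # The parametrized value is a fixture *name*; getfixturevalue
        # resolves it to the fixture's value during the test run.
        conn = request.getfixturevalue(conn)
        assert conn.startswith("sqlite")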
2414s # enable_tracing will set its own wrapping function at self._inner_hookexec 2414s > return self._inner_hookexec(hook_name, methods, kwargs, firstresult) 2414s 2414s /usr/lib/python3/dist-packages/pluggy/_manager.py:120: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s @pytest.hookimpl(wrapper=True) 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[object], request: SubRequest 2414s ) -> Generator[None, object, object]: 2414s try: 2414s > return (yield) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/setuponly.py:36: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturedef = 2414s request = > 2414s 2414s def pytest_fixture_setup( 2414s fixturedef: FixtureDef[FixtureValue], request: SubRequest 2414s ) -> FixtureValue: 2414s """Execution of fixture setup.""" 2414s kwargs = {} 2414s for argname in fixturedef.argnames: 2414s kwargs[argname] = request.getfixturevalue(argname) 2414s 2414s fixturefunc = resolve_fixture_function(fixturedef, request) 2414s my_cache_key = fixturedef.cache_key(request) 2414s try: 2414s > result = call_fixture_func(fixturefunc, request, kwargs) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:1140: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s fixturefunc = 2414s request = > 2414s kwargs = {'postgresql_psycopg2_engine': Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas)} 2414s 2414s def call_fixture_func( 2414s fixturefunc: _FixtureFunc[FixtureValue], request: FixtureRequest, kwargs 2414s ) -> FixtureValue: 2414s if is_generator(fixturefunc): 2414s fixturefunc = cast( 2414s Callable[..., Generator[FixtureValue, None, None]], fixturefunc 2414s ) 2414s generator = fixturefunc(**kwargs) 2414s try: 2414s > fixture_result = next(generator) 2414s 2414s /usr/lib/python3/dist-packages/_pytest/fixtures.py:891: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.fixture 2414s def postgresql_psycopg2_conn(postgresql_psycopg2_engine): 2414s > with postgresql_psycopg2_engine.connect() as conn: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:681: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _________________________ test_psycopg2_schema_support _________________________ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.mark.db 2414s def test_psycopg2_schema_support(postgresql_psycopg2_engine): 2414s conn = postgresql_psycopg2_engine 2414s 2414s # only test this for postgresql (schema's not supported in 2414s # mysql/sqlite) 2414s df = DataFrame({"col1": [1, 2], "col2": [0.1, 0.2], "col3": ["a", "n"]}) 2414s 2414s # create a schema 2414s > with conn.connect() as con: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3905: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s _________________________ test_self_join_date_columns __________________________ 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s 2414s The above exception was the direct cause of the following exception: 2414s 2414s postgresql_psycopg2_engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s @pytest.mark.db 2414s def test_self_join_date_columns(postgresql_psycopg2_engine): 2414s # GH 44421 2414s conn = postgresql_psycopg2_engine 2414s from sqlalchemy.sql import text 2414s 2414s create_table = text( 2414s """ 2414s CREATE TABLE person 2414s ( 2414s id serial constraint person_pkey primary key, 2414s created_dt timestamp with time zone 2414s ); 2414s 2414s INSERT INTO person 2414s VALUES (1, '2021-01-01T00:00:00Z'); 2414s """ 2414s ) 2414s > with conn.connect() as con: 2414s 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3989: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def connect(self) -> Connection: 2414s """Return a new :class:`_engine.Connection` object. 2414s 2414s The :class:`_engine.Connection` acts as a Python context manager, so 2414s the typical use of this method looks like:: 2414s 2414s with engine.connect() as connection: 2414s connection.execute(text("insert into table values ('foo')")) 2414s connection.commit() 2414s 2414s Where above, after the block is completed, the connection is "closed" 2414s and its underlying DBAPI resources are returned to the connection pool. 2414s This also has the effect of rolling back any transaction that 2414s was explicitly begun or was begun via autobegin, and will 2414s emit the :meth:`_events.ConnectionEvents.rollback` event if one was 2414s started and is still in progress. 2414s 2414s .. 
seealso:: 2414s 2414s :meth:`_engine.Engine.begin` 2414s 2414s """ 2414s 2414s > return self._connection_cls(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3278: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s self._dbapi_connection = engine.raw_connection() 2414s except dialect.loaded_dbapi.Error as err: 2414s > Connection._handle_dbapi_exception_noconnection( 2414s err, dialect, engine 2414s ) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:148: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s e = OperationalError('connection to server at "localhost" (::1), port 5432 failed: Connection refused\n\tIs the server run....0.1), port 5432 failed: Connection refused\n\tIs the server running on that host and accepting TCP/IP connections?\n') 2414s dialect = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s is_disconnect = False, invalidate_pool_on_disconnect = True, is_pre_ping = False 2414s 2414s @classmethod 2414s def _handle_dbapi_exception_noconnection( 2414s cls, 2414s e: BaseException, 2414s dialect: Dialect, 2414s engine: Optional[Engine] = None, 2414s is_disconnect: Optional[bool] = None, 2414s invalidate_pool_on_disconnect: bool = True, 2414s is_pre_ping: bool = False, 2414s ) -> NoReturn: 2414s exc_info = sys.exc_info() 2414s 2414s if is_disconnect is None: 2414s is_disconnect = isinstance( 2414s e, dialect.loaded_dbapi.Error 2414s ) and dialect.is_disconnect(e, None, None) 2414s 2414s should_wrap = isinstance(e, dialect.loaded_dbapi.Error) 2414s 2414s if should_wrap: 2414s sqlalchemy_exception = exc.DBAPIError.instance( 2414s None, 2414s None, 2414s cast(Exception, e), 2414s dialect.loaded_dbapi.Error, 2414s hide_parameters=( 2414s engine.hide_parameters if engine is not None else False 2414s ), 2414s connection_invalidated=is_disconnect, 2414s dialect=dialect, 2414s ) 2414s else: 2414s sqlalchemy_exception = None 2414s 2414s newraise = None 2414s 2414s if dialect._has_events: 2414s ctx = ExceptionContextImpl( 2414s e, 2414s sqlalchemy_exception, 2414s engine, 2414s dialect, 2414s None, 2414s None, 2414s None, 2414s None, 2414s None, 2414s is_disconnect, 2414s invalidate_pool_on_disconnect, 2414s is_pre_ping, 2414s ) 2414s for fn in dialect.dispatch.handle_error: 2414s try: 2414s # handler returns an exception; 2414s # call next handler in a chain 2414s per_fn = fn(ctx) 2414s if per_fn is not None: 2414s ctx.chained_exception = newraise = per_fn 2414s except Exception as _raised: 2414s # handler raises an exception - stop processing 2414s newraise = _raised 2414s break 2414s 2414s if sqlalchemy_exception and is_disconnect != ctx.is_disconnect: 2414s sqlalchemy_exception.connection_invalidated = is_disconnect = ( 2414s ctx.is_disconnect 2414s ) 2414s 2414s if newraise: 2414s raise 
newraise.with_traceback(exc_info[2]) from e 2414s elif should_wrap: 2414s assert sqlalchemy_exception is not None 2414s > raise sqlalchemy_exception.with_traceback(exc_info[2]) from e 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:2442: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s engine = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s connection = None, _has_events = None, _allow_revalidate = True 2414s _allow_autobegin = True 2414s 2414s def __init__( 2414s self, 2414s engine: Engine, 2414s connection: Optional[PoolProxiedConnection] = None, 2414s _has_events: Optional[bool] = None, 2414s _allow_revalidate: bool = True, 2414s _allow_autobegin: bool = True, 2414s ): 2414s """Construct a new Connection.""" 2414s self.engine = engine 2414s self.dialect = dialect = engine.dialect 2414s 2414s if connection is None: 2414s try: 2414s > self._dbapi_connection = engine.raw_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = Engine(postgresql+psycopg2://postgres:***@localhost:5432/pandas) 2414s 2414s def raw_connection(self) -> PoolProxiedConnection: 2414s """Return a "raw" DBAPI connection from the connection pool. 2414s 2414s The returned object is a proxied version of the DBAPI 2414s connection object used by the underlying driver in use. 2414s The object will have all the same behavior as the real DBAPI 2414s connection, except that its ``close()`` method will result in the 2414s connection being returned to the pool, rather than being closed 2414s for real. 2414s 2414s This method provides direct DBAPI connection access for 2414s special situations when the API provided by 2414s :class:`_engine.Connection` 2414s is not needed. When a :class:`_engine.Connection` object is already 2414s present, the DBAPI connection is available using 2414s the :attr:`_engine.Connection.connection` accessor. 2414s 2414s .. seealso:: 2414s 2414s :ref:`dbapi_connections` 2414s 2414s """ 2414s > return self.pool.connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/base.py:3302: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def connect(self) -> PoolProxiedConnection: 2414s """Return a DBAPI connection from the pool. 2414s 2414s The connection is instrumented such that when its 2414s ``close()`` method is called, the connection will be returned to 2414s the pool. 
2414s 2414s """ 2414s > return _ConnectionFairy._checkout(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:449: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s threadconns = None, fairy = None 2414s 2414s @classmethod 2414s def _checkout( 2414s cls, 2414s pool: Pool, 2414s threadconns: Optional[threading.local] = None, 2414s fairy: Optional[_ConnectionFairy] = None, 2414s ) -> _ConnectionFairy: 2414s if not fairy: 2414s > fairy = _ConnectionRecord.checkout(pool) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:1263: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s cls = 2414s pool = 2414s 2414s @classmethod 2414s def checkout(cls, pool: Pool) -> _ConnectionFairy: 2414s if TYPE_CHECKING: 2414s rec = cast(_ConnectionRecord, pool._do_get()) 2414s else: 2414s > rec = pool._do_get() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:712: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _do_get(self) -> ConnectionPoolEntry: 2414s > return self._create_connection() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/impl.py:308: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def _create_connection(self) -> ConnectionPoolEntry: 2414s """Called by subclasses to create a new ConnectionRecord.""" 2414s 2414s > return _ConnectionRecord(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:390: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s pool = , connect = True 2414s 2414s def __init__(self, pool: Pool, connect: bool = True): 2414s self.fresh = False 2414s self.fairy_ref = None 2414s self.starttime = 0 2414s self.dbapi_connection = None 2414s 2414s self.__pool = pool 2414s if connect: 2414s > self.__connect() 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:674: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s self.dbapi_connection = connection = pool._invoke_creator(self) 2414s pool.logger.debug("Created new connection %r", connection) 2414s self.fresh = True 2414s except BaseException as e: 2414s > with util.safe_reraise(): 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:900: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s type_ = None, value = None, traceback = None 2414s 2414s def __exit__( 2414s self, 2414s type_: Optional[Type[BaseException]], 2414s value: Optional[BaseException], 2414s traceback: Optional[types.TracebackType], 2414s ) -> NoReturn: 2414s assert self._exc_info is not None 2414s # see #2703 for notes 2414s if type_ is None: 2414s exc_type, exc_value, exc_tb = self._exc_info 2414s assert exc_value is not None 2414s self._exc_info = None # remove potential circular references 2414s > raise exc_value.with_traceback(exc_tb) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/util/langhelpers.py:146: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s 2414s def __connect(self) -> None: 2414s pool = self.__pool 2414s 2414s # ensure any existing connection is removed, so that if 2414s # creator fails, this attribute stays None 2414s self.dbapi_connection = None 2414s try: 2414s self.starttime = time.time() 2414s > self.dbapi_connection = connection = pool._invoke_creator(self) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/pool/base.py:896: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s connection_record = 2414s 2414s def connect( 2414s connection_record: Optional[ConnectionPoolEntry] = None, 2414s ) -> DBAPIConnection: 2414s if dialect._has_events: 2414s for fn in dialect.dispatch.do_connect: 2414s connection = cast( 2414s DBAPIConnection, 2414s fn(dialect, connection_record, cargs, cparams), 2414s ) 2414s if connection is not None: 2414s return connection 2414s 2414s > return dialect.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/create.py:643: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s self = 2414s cargs = () 2414s cparams = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s 2414s def connect(self, *cargs, **cparams): 2414s # inherits the docstring from interfaces.Dialect.connect 2414s > return self.loaded_dbapi.connect(*cargs, **cparams) 2414s 2414s /usr/lib/python3/dist-packages/sqlalchemy/engine/default.py:621: 2414s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2414s 2414s dsn = 'host=localhost dbname=pandas user=postgres password=postgres port=5432' 2414s connection_factory = None, cursor_factory = None 2414s kwargs = {'dbname': 'pandas', 'host': 'localhost', 'password': 'postgres', 'port': 5432, ...} 2414s kwasync = {} 2414s 2414s def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs): 2414s """ 2414s Create a new database connection. 2414s 2414s The connection parameters can be specified as a string: 2414s 2414s conn = psycopg2.connect("dbname=test user=postgres password=secret") 2414s 2414s or using a set of keyword arguments: 2414s 2414s conn = psycopg2.connect(database="test", user="postgres", password="secret") 2414s 2414s Or as a mix of both. The basic connection parameters are: 2414s 2414s - *dbname*: the database name 2414s - *database*: the database name (only as keyword argument) 2414s - *user*: user name used to authenticate 2414s - *password*: password used to authenticate 2414s - *host*: database host address (defaults to UNIX socket if not provided) 2414s - *port*: connection port number (defaults to 5432 if not provided) 2414s 2414s Using the *connection_factory* parameter a different class or connections 2414s factory can be specified. It should be a callable object taking a dsn 2414s argument. 2414s 2414s Using the *cursor_factory* parameter, a new default cursor factory will be 2414s used by cursor(). 2414s 2414s Using *async*=True an asynchronous connection will be created. *async_* is 2414s a valid alias (for Python versions where ``async`` is a keyword). 2414s 2414s Any other keyword parameter will be passed to the underlying client 2414s library: the list of supported parameters depends on the library version. 
2414s 2414s """ 2414s kwasync = {} 2414s if 'async' in kwargs: 2414s kwasync['async'] = kwargs.pop('async') 2414s if 'async_' in kwargs: 2414s kwasync['async_'] = kwargs.pop('async_') 2414s 2414s dsn = _ext.make_dsn(dsn, **kwargs) 2414s > conn = _connect(dsn, connection_factory=connection_factory, **kwasync) 2414s E sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused 2414s E Is the server running on that host and accepting TCP/IP connections? 2414s E 2414s E (Background on this error at: https://sqlalche.me/e/20/e3q8) 2414s 2414s /usr/lib/python3/dist-packages/psycopg2/__init__.py:122: OperationalError 2414s =============================== warnings summary =============================== 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:895 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:895: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("mysql_pymysql_engine", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:896 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:896: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("mysql_pymysql_conn", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:900 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:900: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("mysql_pymysql_engine_iris", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:901 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:901: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("mysql_pymysql_conn_iris", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:905 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:905: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("mysql_pymysql_engine_types", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:906 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:906: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("mysql_pymysql_conn_types", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:910 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:910: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_psycopg2_engine", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:911 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:911: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_psycopg2_conn", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:915 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:915: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_psycopg2_engine_iris", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:916 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:916: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_psycopg2_conn_iris", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:920 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:920: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_psycopg2_engine_types", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:921 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:921: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_psycopg2_conn_types", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:954 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:954: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_adbc_conn", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:958 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:958: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_adbc_iris", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:959 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:959: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("sqlite_adbc_iris", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:963 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:963: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("postgresql_adbc_types", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:964 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:964: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s pytest.param("sqlite_adbc_types", marks=pytest.mark.db), 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3896 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3896: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s @pytest.mark.db 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3971 2414s /usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py:3971: PytestUnknownMarkWarning: Unknown pytest.mark.db - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s @pytest.mark.db 2414s 2414s ../../../usr/lib/python3/dist-packages/pandas/tests/tools/test_to_datetime.py:1143 2414s /usr/lib/python3/dist-packages/pandas/tests/tools/test_to_datetime.py:1143: PytestUnknownMarkWarning: Unknown pytest.mark.skip_ubsan - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2414s @pytest.mark.skip_ubsan 2414s 2414s io/test_sql.py: 1535 warnings 2414s tools/test_to_datetime.py: 978 warnings 2414s /usr/lib/python3/dist-packages/py/_process/forkedfunc.py:45: DeprecationWarning: This process (pid=20744) is multi-threaded, use of fork() may lead to deadlocks in the child. 
2414s pid = os.fork() 2414s 2414s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 2414s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/pytest-cache-files-ew78zh0a' 2414s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 2414s 2414s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429 2414s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/.pytest_cache/v/cache/lastfailed: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/pytest-cache-files-8bnihkh6' 2414s config.cache.set("cache/lastfailed", self.lastfailed) 2414s 2414s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 2414s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/pytest-cache-files-eqaucmt5' 2414s session.config.cache.set(STEPWISE_CACHE_DIR, []) 2414s 2414s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 2414s =========================== short test summary info ============================ 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql_empty[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql_empty[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql_empty[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql_empty[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[None-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[None-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[None-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[None-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[multi-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[multi-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[multi-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[multi-postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[replace-1-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[replace-1-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[replace-1-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[replace-1-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[append-2-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[append-2-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[append-2-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[append-2-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist_fail[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist_fail[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist_fail[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist_fail[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_chunksize[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_chunksize[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_chunksize[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_chunksize[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_expression_with_parameter[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_expression_with_parameter[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_expression_with_parameter[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_expression_with_parameter[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_string_with_parameter[mysql_pymysql_engine_iris] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_string_with_parameter[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_string_with_parameter[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_string_with_parameter[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table_chunksize[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table_chunksize[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table_chunksize[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table_chunksize[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_callable[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_callable[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_callable[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_callable[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_type_conversion[mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_type_conversion[mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_type_conversion[postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_type_conversion[postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_procedure[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_procedure[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[2-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[2-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_on_public_schema[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_on_public_schema[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_insertion_method_on_conflict_update[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_insertion_method_on_conflict_update[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_view_postgres[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_view_postgres[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_parameter[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_parameter[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_parameter[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_parameter[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_named_parameter[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_named_parameter[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_named_parameter[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_named_parameter[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_view[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_view[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_view[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_view[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_with_chunksize_no_result[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_with_chunksize_no_result[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql[mysql_pymysql_engine] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_fail[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_fail[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_fail[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_fail[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_replace[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_replace[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_replace[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_replace[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_append[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_append[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_append[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_append[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_type_mapping[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_type_mapping[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_type_mapping[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_type_mapping[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_series[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_series[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_series[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_series[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip[postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip_chunksize[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip_chunksize[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip_chunksize[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip_chunksize[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_execute_sql[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_execute_sql[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_execute_sql[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_execute_sql[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_parsing[mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_parsing[mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_parsing[postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_parsing[postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-mysql_pymysql_conn_types] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-mysql_pymysql_engine_types] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_and_index[mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_and_index[mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_and_index[postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_and_index[postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_timedelta[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_timedelta[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_timedelta[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_timedelta[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_complex_raises[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_complex_raises[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_complex_raises[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_complex_raises[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-None-index-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-None-index-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-other_label-other_label-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-other_label-other_label-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-None-index_name-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-None-index_name-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-other_label-other_label-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-other_label-other_label-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[0-None-0-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[0-None-0-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-0-0-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-0-0-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label_multiindex[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label_multiindex[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_multiindex_roundtrip[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_multiindex_roundtrip[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_multiindex_roundtrip[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_multiindex_roundtrip[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[None-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[None-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[None-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[None-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[int-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[int-mysql_pymysql_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[int-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[int-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[float-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[float-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[float-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[float-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[dtype3-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[dtype3-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[dtype3-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[dtype3-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_integer_col_names[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_integer_col_names[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_integer_col_names[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_integer_col_names[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_with_schema[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_with_schema[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_with_schema[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_with_schema[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_dtypes[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_dtypes[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_dtypes[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_dtypes[postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_keys[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_keys[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_keys[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_keys[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_chunksize_read[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_chunksize_read[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_chunksize_read[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_chunksize_read[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_categorical[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_categorical[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_categorical[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_categorical[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_unicode_column_name[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_unicode_column_name[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_unicode_column_name[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_unicode_column_name[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_escaped_table_name[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_escaped_table_name[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_escaped_table_name[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_escaped_table_name[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_duplicate_columns[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_duplicate_columns[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_duplicate_columns[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_duplicate_columns[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_columns[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_columns[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_columns[postgresql_psycopg2_engine] 2414s 
FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_columns[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_index_col[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_index_col[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_index_col[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_index_col[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_delegate[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_delegate[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_delegate[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_delegate[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_warning_case_insensitive_table_name[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_warning_case_insensitive_table_name[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_warning_case_insensitive_table_name[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_warning_case_insensitive_table_name[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_type_mapping[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_type_mapping[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_type_mapping[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_type_mapping[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int8-SMALLINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int8-SMALLINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int8-SMALLINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int8-SMALLINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint8-SMALLINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint8-SMALLINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt8-SMALLINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt8-SMALLINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int16-SMALLINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int16-SMALLINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int16-SMALLINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int16-SMALLINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint16-INTEGER-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint16-INTEGER-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt16-INTEGER-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt16-INTEGER-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_engine] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int32-INTEGER-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int32-INTEGER-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int32-INTEGER-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int32-INTEGER-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint32-BIGINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint32-BIGINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt32-BIGINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt32-BIGINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int64-BIGINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int64-BIGINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int64-BIGINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int64-BIGINT-mysql_pymysql_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int-BIGINT-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int-BIGINT-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[uint64-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[uint64-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[UInt64-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[UInt64-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_database_uri_string[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_database_uri_string[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_database_uri_string[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_pg8000_sqlalchemy_passthrough_error[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_pg8000_sqlalchemy_passthrough_error[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_pg8000_sqlalchemy_passthrough_error[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_text_obj[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_text_obj[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_text_obj[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_text_obj[postgresql_psycopg2_conn_iris] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_select_obj[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_select_obj[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_select_obj[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_select_obj[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_column_with_percentage[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_column_with_percentage[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_column_with_percentage[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_column_with_percentage[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_create_table[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_create_table[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_create_table[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_create_table[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_drop_table[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_drop_table[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_drop_table[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_drop_table[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_roundtrip[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_roundtrip[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_roundtrip[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_roundtrip[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_execute_sql[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_execute_sql[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_execute_sql[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_execute_sql[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table[postgresql_psycopg2_engine_iris] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table_columns[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table_columns[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table_columns[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table_columns[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_absent_raises[mysql_pymysql_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_absent_raises[mysql_pymysql_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_absent_raises[postgresql_psycopg2_engine_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_absent_raises[postgresql_psycopg2_conn_iris] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_default_type_conversion[postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_default_type_conversion[postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_bigint[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_bigint[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_bigint[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_bigint[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_date_load[mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_date_load[mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_date_load[postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_date_load[postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[None-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[None-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_table[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_table[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_roundtrip[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_roundtrip[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_roundtrip[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_roundtrip[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_out_of_bounds_datetime[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_out_of_bounds_datetime[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_out_of_bounds_datetime[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_out_of_bounds_datetime[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_naive_datetimeindex_roundtrip[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_naive_datetimeindex_roundtrip[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_naive_datetimeindex_roundtrip[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_naive_datetimeindex_roundtrip[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_date_parsing[mysql_pymysql_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_date_parsing[mysql_pymysql_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_date_parsing[postgresql_psycopg2_engine_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_date_parsing[postgresql_psycopg2_conn_types] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_NaT[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_NaT[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_NaT[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_NaT[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_date[mysql_pymysql_engine] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_date[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_date[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_date[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_time[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_time[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_time[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_time[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_mixed_dtype_insert[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_mixed_dtype_insert[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_mixed_dtype_insert[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_mixed_dtype_insert[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_numeric[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_numeric[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_numeric[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_numeric[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_fullcolumn[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_fullcolumn[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_fullcolumn[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_fullcolumn[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_string[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_string[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_string[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_string[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_save_index[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_save_index[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_save_index[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_save_index[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transactions[mysql_pymysql_engine] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transactions[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transactions[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transactions[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transaction_rollback[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transaction_rollback[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transaction_rollback[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transaction_rollback[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_get_schema_create_table[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_get_schema_create_table[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_get_schema_create_table[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_get_schema_create_table[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dtype[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dtype[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dtype[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dtype[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_notna_dtype[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_notna_dtype[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_notna_dtype[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_notna_dtype[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_double_precision[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_double_precision[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_double_precision[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_double_precision[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_connectable_issue_example[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_connectable_issue_example[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_connectable_issue_example[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_connectable_issue_example[postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input0-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input0-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input1-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input1-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input2-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input2-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_temporary_table[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_temporary_table[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_temporary_table[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_temporary_table[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_invalid_engine[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_invalid_engine[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_invalid_engine[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_invalid_engine[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_sql_engine[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_sql_engine[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_sql_engine[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_sql_engine[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_sqlalchemy[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_sqlalchemy[mysql_pymysql_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_sqlalchemy[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_sqlalchemy[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_auto[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_auto[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_auto[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_auto[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_conn] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_table-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_table-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_query-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_query-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_chunksize_empty_dtypes[mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_chunksize_empty_dtypes[mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_chunksize_empty_dtypes[postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_chunksize_empty_dtypes[postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-_NoDefault.no_default-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-_NoDefault.no_default-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-numpy_nullable-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-numpy_nullable-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_engine] 2414s FAILED 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-_NoDefault.no_default-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-_NoDefault.no_default-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-numpy_nullable-mysql_pymysql_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-numpy_nullable-mysql_pymysql_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_engine] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_conn] 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_psycopg2_schema_support 2414s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_self_join_date_columns 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql_empty[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dataframe_to_sql_empty[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[None-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[None-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[multi-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql[multi-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[replace-1-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[replace-1-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[append-2-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist[append-2-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist_fail[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_exist_fail[postgresql_psycopg2_conn] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_chunksize[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_chunksize[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_expression_with_parameter[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_expression_with_parameter[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_string_with_parameter[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_query_string_with_parameter[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table_chunksize[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_iris_table_chunksize[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_callable[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_callable[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_type_conversion[postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_type_conversion[postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[2-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[2-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_copy_from_callable_insertion_method[Success!-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_insertion_method_on_conflict_do_nothing[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_on_public_schema[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_on_public_schema[postgresql_psycopg2_conn] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_view_postgres[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_view_postgres[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_parameter[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_parameter[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_named_parameter[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_iris_named_parameter[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_view[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_view[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_with_chunksize_no_result[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_fail[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_fail[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_replace[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_replace[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_append[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_append[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_type_mapping[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_type_mapping[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_series[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_series[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip_chunksize[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_roundtrip_chunksize[postgresql_psycopg2_conn] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_execute_sql[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_execute_sql[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_parsing[postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_parsing[postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-ignore-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-raise-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-SELECT * FROM types-mode0-coerce-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-ignore-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-raise-postgresql_psycopg2_conn_types] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_query-SELECT * FROM types-mode2-coerce-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-ignore-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-raise-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_custom_dateparsing_error[read_sql_table-types-sqlalchemy-coerce-postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_and_index[postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_date_and_index[postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_timedelta[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_timedelta[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_complex_raises[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_complex_raises[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-None-index-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-other_label-other_label-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-None-index_name-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_engine] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[index_name-other_label-other_label-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[0-None-0-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label[None-0-0-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label_multiindex[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_to_sql_index_label_multiindex[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_multiindex_roundtrip[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_multiindex_roundtrip[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[None-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[None-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[int-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[int-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[float-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[float-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[dtype3-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_dtype_argument[dtype3-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_integer_col_names[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_integer_col_names[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_with_schema[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_with_schema[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_dtypes[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_dtypes[postgresql_psycopg2_conn] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_keys[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_get_schema_keys[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_chunksize_read[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_chunksize_read[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_categorical[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_categorical[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_unicode_column_name[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_unicode_column_name[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_escaped_table_name[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_escaped_table_name[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_duplicate_columns[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_api_read_sql_duplicate_columns[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_columns[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_columns[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_index_col[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_index_col[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_delegate[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_delegate[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_warning_case_insensitive_table_name[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_warning_case_insensitive_table_name[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_type_mapping[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_type_mapping[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int8-SMALLINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_engine] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int8-SMALLINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint8-SMALLINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt8-SMALLINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int16-SMALLINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int16-SMALLINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint16-INTEGER-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt16-INTEGER-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int32-INTEGER-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int32-INTEGER-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[uint32-BIGINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[UInt32-BIGINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int64-BIGINT-postgresql_psycopg2_conn] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[Int64-BIGINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_mapping[int-BIGINT-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[uint64-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_integer_overload_mapping[UInt64-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_database_uri_string[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_database_uri_string[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_pg8000_sqlalchemy_passthrough_error[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_pg8000_sqlalchemy_passthrough_error[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_text_obj[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_text_obj[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_select_obj[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_query_by_select_obj[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_column_with_percentage[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_column_with_percentage[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_create_table[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_create_table[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_drop_table[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_drop_table[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_roundtrip[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_roundtrip[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_execute_sql[postgresql_psycopg2_engine_iris] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_execute_sql[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table_columns[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_read_table_columns[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_absent_raises[postgresql_psycopg2_engine_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_table_absent_raises[postgresql_psycopg2_conn_iris] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_default_type_conversion[postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_sqlalchemy_default_type_conversion[postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_bigint[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_bigint[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_date_load[postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_default_date_load[postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[None-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[None-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query[parse_dates1-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_query_chunksize[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_table[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_table[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_roundtrip[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_with_timezone_roundtrip[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_out_of_bounds_datetime[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_out_of_bounds_datetime[postgresql_psycopg2_conn] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_naive_datetimeindex_roundtrip[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_naive_datetimeindex_roundtrip[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_date_parsing[postgresql_psycopg2_engine_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_date_parsing[postgresql_psycopg2_conn_types] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_NaT[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_NaT[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_date[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_date[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_time[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_datetime_time[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_mixed_dtype_insert[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_mixed_dtype_insert[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_numeric[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_numeric[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_fullcolumn[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_fullcolumn[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_string[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_nan_string[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_save_index[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_save_index[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transactions[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transactions[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transaction_rollback[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_transaction_rollback[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_get_schema_create_table[postgresql_psycopg2_engine] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_get_schema_create_table[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dtype[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_dtype[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_notna_dtype[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_notna_dtype[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_double_precision[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_double_precision[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_connectable_issue_example[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_connectable_issue_example[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input0-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input1-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_negative_npinf[input2-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_temporary_table[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_temporary_table[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_invalid_engine[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_invalid_engine[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_sql_engine[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_to_sql_with_sql_engine[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_sqlalchemy[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_sqlalchemy[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_auto[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_options_auto[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend[python-numpy_nullable-read_sql_query-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype_backend_table[python-numpy_nullable-read_sql_table-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_table-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_invalid_dtype_backend_table[read_sql_query-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_chunksize_empty_dtypes[postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_chunksize_empty_dtypes[postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-_NoDefault.no_default-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql-numpy_nullable-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_engine] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-_NoDefault.no_default-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_engine] 2414s ERROR 
../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_read_sql_dtype[read_sql_query-numpy_nullable-postgresql_psycopg2_conn] 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_psycopg2_schema_support 2414s ERROR ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py::test_self_join_date_columns 2414s = 590 failed, 1468 passed, 411 skipped, 46 xfailed, 2536 warnings, 306 errors in 358.54s (0:05:58) = 2414s === python3.12 === 2414s tests that use numba (may crash on non-x86) - checked with grep -rl -e numba pandas/tests - -m not slow because there are enough to time out otherwise 2416s ============================= test session starts ============================== 2416s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 2416s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 2416s rootdir: /usr/lib/python3/dist-packages/pandas/tests 2416s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 2416s asyncio: mode=Mode.STRICT 2416s collected 10446 items / 536 deselected / 2 skipped / 9910 selected 2416s 2418s ../../../usr/lib/python3/dist-packages/pandas/tests/frame/test_ufunc.py ....xx.........xxxxxxxx.xx....s. 2419s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py ..............................s 2420s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 2421s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 2421s ../../../usr/lib/python3/dist-packages/pandas/tests/util/test_numba.py . 2446s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_ewm.py ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 2455s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_expanding.py ........x.......................x..x..x..x..x..x....................x.......................x..x..x..x..x..x................................................................................................................................................................................................................................................................................ 
2473s ../../../usr/lib/python3/dist-packages/pandas/tests/window/moments/test_moments_consistency_rolling.py ..............x..x............................................x..x..x..x..x..x..x..x..x..x..x..x......................................x..x............................................x..x..x..x..x..x..x..x..x..x..x..x................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 2494s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_api.py ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 2496s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_apply.py ...s....sssss..........s..s....................................................... 2500s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_base_indexer.py .................................................................................................................................................................................................................................... 2501s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_cython_aggregations.py ........................................................................ 
2559s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_dtypes.py .................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... 2565s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_ewm.py .......................................................................................................................................................................................................................................ssssssssssss........ssssssssssssssss................ 2573s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_expanding.py ..........x................................................................................................................................................................................................ss....s...................s..s......s............................................................................................. 
2577s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_groupby.py ................................................................................................................... 2586s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss 2595s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_pairwise.py ........................................................................................................................................................................................................................................................................................................................ 2615s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling.py ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ 2626s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_functions.py .................................................................................................................................................................................................................................................................................................................................................................................................................................. 2632s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_quantile.py .......................................................................................................................................................................................... 2637s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py .................................................................... 
2639s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_timeseries_window.py ..................................................................................s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_win_type.py ............................................................................................................................................................................................................................................................................................... 2728s 2728s =============================== warnings summary =============================== 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_numba.py:11 2728s /usr/lib/python3/dist-packages/pandas/tests/groupby/test_numba.py:11: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s pytestmark = pytest.mark.single_cpu 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py:944 2728s /usr/lib/python3/dist-packages/pandas/tests/groupby/test_timegrouper.py:944: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s @pytest.mark.single_cpu 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py:14 2728s /usr/lib/python3/dist-packages/pandas/tests/groupby/transform/test_numba.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s pytestmark = pytest.mark.single_cpu 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:22 2728s /usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:22: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s pytestmark = pytest.mark.single_cpu 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:230 2728s /usr/lib/python3/dist-packages/pandas/tests/groupby/aggregate/test_numba.py:230: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s @pytest.mark.single_cpu 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:21 2728s /usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:21: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s pytestmark = pytest.mark.single_cpu 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:326 2728s /usr/lib/python3/dist-packages/pandas/tests/window/test_numba.py:326: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s @pytest.mark.slow 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_online.py:11 2728s /usr/lib/python3/dist-packages/pandas/tests/window/test_online.py:11: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s pytestmark = pytest.mark.single_cpu 2728s 2728s ../../../usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py:155 2728s /usr/lib/python3/dist-packages/pandas/tests/window/test_rolling_skew_kurt.py:155: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2728s @pytest.mark.slow 2728s 2728s frame/test_ufunc.py: 32 warnings 2728s groupby/test_timegrouper.py: 31 warnings 2728s groupby/transform/test_numba.py: 53 warnings 2728s groupby/aggregate/test_numba.py: 35 warnings 2728s util/test_numba.py: 1 warning 2728s window/moments/test_moments_consistency_ewm.py: 1088 warnings 2728s window/moments/test_moments_consistency_expanding.py: 380 warnings 2728s window/moments/test_moments_consistency_rolling.py: 760 warnings 2728s window/test_api.py: 937 warnings 2728s window/test_apply.py: 82 warnings 2728s window/test_base_indexer.py: 228 warnings 2728s window/test_cython_aggregations.py: 72 warnings 2728s window/test_dtypes.py: 2580 warnings 2728s window/test_ewm.py: 283 warnings 2728s window/test_expanding.py: 333 warnings 2728s window/test_groupby.py: 115 warnings 2728s window/test_numba.py: 51 warnings 2728s window/test_pairwise.py: 312 warnings 2728s window/test_rolling.py: 888 warnings 2728s window/test_rolling_functions.py: 418 warnings 2728s window/test_rolling_quantile.py: 186 warnings 2728s window/test_rolling_skew_kurt.py: 68 warnings 2728s window/test_timeseries_window.py: 83 warnings 2728s window/test_win_type.py: 287 warnings 2728s /usr/lib/python3/dist-packages/py/_process/forkedfunc.py:45: DeprecationWarning: This process (pid=23288) is multi-threaded, use of fork() may lead to deadlocks in the child. 
2728s pid = os.fork() 2728s 2728s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 2728s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/pytest-cache-files-lhga9c0w' 2728s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 2728s 2728s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 2728s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/pytest-cache-files-qcto7m1q' 2728s session.config.cache.set(STEPWISE_CACHE_DIR, []) 2728s 2728s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 2728s = 9064 passed, 793 skipped, 536 deselected, 55 xfailed, 9314 warnings in 312.77s (0:05:12) = 2728s tests with a run=False xfail for hdf5 crashes - see xfail_tests_nonintel_io.patch 2729s ============================= test session starts ============================== 2729s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 2729s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 2729s rootdir: /usr/lib/python3/dist-packages/pandas/tests/io/pytables 2729s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 2729s asyncio: mode=Mode.STRICT 2729s collected 278 items 2729s 2738s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py ...................................................FFFFFFFFF................................................................................................................................... 2740s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py ....F................ 2743s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py ............................................F..................... 
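The "run=False xfail" mentioned above (see xfail_tests_nonintel_io.patch) is pytest's way of collecting a test without executing it, so a known hdf5 crash cannot abort the whole session. A minimal sketch of that pattern, using illustrative names only (the real condition, reason and test names come from the Debian patch and from the decorators visible in the failures below), might look like:

    import pytest

    # Assumption for illustration: in the real patch this flag is derived from a
    # platform/architecture check; here it is just a constant.
    is_crashing_arch = False

    @pytest.mark.xfail(
        condition=is_crashing_arch,   # expect failure only on affected architectures
        reason="hdf5 I/O crashes on this architecture",
        strict=False,                 # an unexpected pass elsewhere is still accepted
        run=not is_crashing_arch,     # collected but never executed where it would crash
    )
    def test_hdf5_roundtrip():
        ...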
2743s 2743s =================================== FAILURES =================================== 2743s ___________________________ test_complibs[blosc2-1] ____________________________ 2743s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-275/test_complibs_blosc2_1_0') 2743s lvl = 1, lib = 'blosc2' 2743s request = > 2743s 2743s @pytest.mark.parametrize("lvl", range(10)) 2743s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2743s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2743s @pytest.mark.skipif( 2743s not PY311 and is_ci_environment() and is_platform_linux(), 2743s reason="Segfaulting in a CI environment" 2743s # with xfail, would sometimes raise UnicodeDecodeError 2743s # invalid state byte 2743s ) 2743s def test_complibs(tmp_path, lvl, lib, request): 2743s # GH14478 2743s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2743s request.applymarker( 2743s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2743s ) 2743s df = DataFrame( 2743s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2743s ) 2743s 2743s # Remove lzo if its not available on this platform 2743s if not tables.which_lib_version("lzo"): 2743s pytest.skip("lzo not available") 2743s # Remove bzip2 if its not available on this platform 2743s if not tables.which_lib_version("bzip2"): 2743s pytest.skip("bzip2 not available") 2743s 2743s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2743s gname = f"{lvl}_{lib}" 2743s 2743s # Write and read file to see if data is consistent 2743s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2743s result = read_hdf(tmpfile, gname) 2743s tm.assert_frame_equal(result, df) 2743s 2743s # Open file and check metadata for correct amount of compression 2743s with tables.open_file(tmpfile, mode="r") as h5table: 2743s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2743s assert node.filters.complevel == lvl 2743s if lvl == 0: 2743s assert node.filters.complib is None 2743s else: 2743s > assert node.filters.complib == lib 2743s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2743s E 2743s E - blosc2 2743s E + blosc2:blosclz 2743s 2743s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2743s ___________________________ test_complibs[blosc2-2] ____________________________ 2743s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-276/test_complibs_blosc2_2_0') 2743s lvl = 2, lib = 'blosc2' 2743s request = > 2743s 2743s @pytest.mark.parametrize("lvl", range(10)) 2743s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2743s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2743s @pytest.mark.skipif( 2743s not PY311 and is_ci_environment() and is_platform_linux(), 2743s reason="Segfaulting in a CI environment" 2743s # with xfail, would sometimes raise UnicodeDecodeError 2743s # invalid state byte 2743s ) 2743s def test_complibs(tmp_path, lvl, lib, request): 2743s # GH14478 2743s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2743s request.applymarker( 2743s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2743s ) 2743s df = DataFrame( 2743s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2743s ) 2743s 2743s # Remove lzo if its not available on this platform 2743s if not tables.which_lib_version("lzo"): 2743s pytest.skip("lzo not available") 2743s # Remove bzip2 if its not available on this platform 2743s if not 
tables.which_lib_version("bzip2"): 2743s pytest.skip("bzip2 not available") 2743s 2743s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2743s gname = f"{lvl}_{lib}" 2743s 2743s # Write and read file to see if data is consistent 2743s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2743s result = read_hdf(tmpfile, gname) 2743s tm.assert_frame_equal(result, df) 2743s 2743s # Open file and check metadata for correct amount of compression 2743s with tables.open_file(tmpfile, mode="r") as h5table: 2743s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2743s assert node.filters.complevel == lvl 2743s if lvl == 0: 2743s assert node.filters.complib is None 2743s else: 2743s > assert node.filters.complib == lib 2743s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2743s E 2743s E - blosc2 2743s E + blosc2:blosclz 2743s 2743s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2743s ___________________________ test_complibs[blosc2-3] ____________________________ 2743s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-277/test_complibs_blosc2_3_0') 2743s lvl = 3, lib = 'blosc2' 2743s request = > 2743s 2743s @pytest.mark.parametrize("lvl", range(10)) 2743s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2743s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2743s @pytest.mark.skipif( 2743s not PY311 and is_ci_environment() and is_platform_linux(), 2743s reason="Segfaulting in a CI environment" 2743s # with xfail, would sometimes raise UnicodeDecodeError 2743s # invalid state byte 2743s ) 2743s def test_complibs(tmp_path, lvl, lib, request): 2743s # GH14478 2743s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2743s request.applymarker( 2743s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2743s ) 2743s df = DataFrame( 2743s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2743s ) 2743s 2743s # Remove lzo if its not available on this platform 2743s if not tables.which_lib_version("lzo"): 2743s pytest.skip("lzo not available") 2743s # Remove bzip2 if its not available on this platform 2743s if not tables.which_lib_version("bzip2"): 2743s pytest.skip("bzip2 not available") 2743s 2743s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2743s gname = f"{lvl}_{lib}" 2743s 2743s # Write and read file to see if data is consistent 2743s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2743s result = read_hdf(tmpfile, gname) 2743s tm.assert_frame_equal(result, df) 2743s 2743s # Open file and check metadata for correct amount of compression 2743s with tables.open_file(tmpfile, mode="r") as h5table: 2743s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2743s assert node.filters.complevel == lvl 2743s if lvl == 0: 2743s assert node.filters.complib is None 2743s else: 2743s > assert node.filters.complib == lib 2743s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2743s E 2743s E - blosc2 2743s E + blosc2:blosclz 2743s 2743s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2743s ___________________________ test_complibs[blosc2-4] ____________________________ 2743s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-278/test_complibs_blosc2_4_0') 2743s lvl = 4, lib = 'blosc2' 2743s request = > 2743s 2743s @pytest.mark.parametrize("lvl", range(10)) 2743s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2743s @pytest.mark.filterwarnings("ignore:object name 
is not a valid") 2743s @pytest.mark.skipif( 2743s not PY311 and is_ci_environment() and is_platform_linux(), 2743s reason="Segfaulting in a CI environment" 2743s # with xfail, would sometimes raise UnicodeDecodeError 2743s # invalid state byte 2743s ) 2743s def test_complibs(tmp_path, lvl, lib, request): 2743s # GH14478 2743s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2743s request.applymarker( 2744s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2744s ) 2744s df = DataFrame( 2744s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2744s ) 2744s 2744s # Remove lzo if its not available on this platform 2744s if not tables.which_lib_version("lzo"): 2744s pytest.skip("lzo not available") 2744s # Remove bzip2 if its not available on this platform 2744s if not tables.which_lib_version("bzip2"): 2744s pytest.skip("bzip2 not available") 2744s 2744s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2744s gname = f"{lvl}_{lib}" 2744s 2744s # Write and read file to see if data is consistent 2744s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2744s result = read_hdf(tmpfile, gname) 2744s tm.assert_frame_equal(result, df) 2744s 2744s # Open file and check metadata for correct amount of compression 2744s with tables.open_file(tmpfile, mode="r") as h5table: 2744s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2744s assert node.filters.complevel == lvl 2744s if lvl == 0: 2744s assert node.filters.complib is None 2744s else: 2744s > assert node.filters.complib == lib 2744s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2744s E 2744s E - blosc2 2744s E + blosc2:blosclz 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2744s ___________________________ test_complibs[blosc2-5] ____________________________ 2744s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-279/test_complibs_blosc2_5_0') 2744s lvl = 5, lib = 'blosc2' 2744s request = > 2744s 2744s @pytest.mark.parametrize("lvl", range(10)) 2744s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2744s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2744s @pytest.mark.skipif( 2744s not PY311 and is_ci_environment() and is_platform_linux(), 2744s reason="Segfaulting in a CI environment" 2744s # with xfail, would sometimes raise UnicodeDecodeError 2744s # invalid state byte 2744s ) 2744s def test_complibs(tmp_path, lvl, lib, request): 2744s # GH14478 2744s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2744s request.applymarker( 2744s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2744s ) 2744s df = DataFrame( 2744s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2744s ) 2744s 2744s # Remove lzo if its not available on this platform 2744s if not tables.which_lib_version("lzo"): 2744s pytest.skip("lzo not available") 2744s # Remove bzip2 if its not available on this platform 2744s if not tables.which_lib_version("bzip2"): 2744s pytest.skip("bzip2 not available") 2744s 2744s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2744s gname = f"{lvl}_{lib}" 2744s 2744s # Write and read file to see if data is consistent 2744s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2744s result = read_hdf(tmpfile, gname) 2744s tm.assert_frame_equal(result, df) 2744s 2744s # Open file and check metadata for correct amount of compression 2744s with tables.open_file(tmpfile, mode="r") as h5table: 
2744s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2744s assert node.filters.complevel == lvl 2744s if lvl == 0: 2744s assert node.filters.complib is None 2744s else: 2744s > assert node.filters.complib == lib 2744s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2744s E 2744s E - blosc2 2744s E + blosc2:blosclz 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2744s ___________________________ test_complibs[blosc2-6] ____________________________ 2744s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-280/test_complibs_blosc2_6_0') 2744s lvl = 6, lib = 'blosc2' 2744s request = > 2744s 2744s @pytest.mark.parametrize("lvl", range(10)) 2744s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2744s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2744s @pytest.mark.skipif( 2744s not PY311 and is_ci_environment() and is_platform_linux(), 2744s reason="Segfaulting in a CI environment" 2744s # with xfail, would sometimes raise UnicodeDecodeError 2744s # invalid state byte 2744s ) 2744s def test_complibs(tmp_path, lvl, lib, request): 2744s # GH14478 2744s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2744s request.applymarker( 2744s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2744s ) 2744s df = DataFrame( 2744s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2744s ) 2744s 2744s # Remove lzo if its not available on this platform 2744s if not tables.which_lib_version("lzo"): 2744s pytest.skip("lzo not available") 2744s # Remove bzip2 if its not available on this platform 2744s if not tables.which_lib_version("bzip2"): 2744s pytest.skip("bzip2 not available") 2744s 2744s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2744s gname = f"{lvl}_{lib}" 2744s 2744s # Write and read file to see if data is consistent 2744s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2744s result = read_hdf(tmpfile, gname) 2744s tm.assert_frame_equal(result, df) 2744s 2744s # Open file and check metadata for correct amount of compression 2744s with tables.open_file(tmpfile, mode="r") as h5table: 2744s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2744s assert node.filters.complevel == lvl 2744s if lvl == 0: 2744s assert node.filters.complib is None 2744s else: 2744s > assert node.filters.complib == lib 2744s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2744s E 2744s E - blosc2 2744s E + blosc2:blosclz 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2744s ___________________________ test_complibs[blosc2-7] ____________________________ 2744s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-281/test_complibs_blosc2_7_0') 2744s lvl = 7, lib = 'blosc2' 2744s request = > 2744s 2744s @pytest.mark.parametrize("lvl", range(10)) 2744s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2744s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2744s @pytest.mark.skipif( 2744s not PY311 and is_ci_environment() and is_platform_linux(), 2744s reason="Segfaulting in a CI environment" 2744s # with xfail, would sometimes raise UnicodeDecodeError 2744s # invalid state byte 2744s ) 2744s def test_complibs(tmp_path, lvl, lib, request): 2744s # GH14478 2744s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2744s request.applymarker( 2744s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", 
strict=False) 2744s ) 2744s df = DataFrame( 2744s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2744s ) 2744s 2744s # Remove lzo if its not available on this platform 2744s if not tables.which_lib_version("lzo"): 2744s pytest.skip("lzo not available") 2744s # Remove bzip2 if its not available on this platform 2744s if not tables.which_lib_version("bzip2"): 2744s pytest.skip("bzip2 not available") 2744s 2744s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2744s gname = f"{lvl}_{lib}" 2744s 2744s # Write and read file to see if data is consistent 2744s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2744s result = read_hdf(tmpfile, gname) 2744s tm.assert_frame_equal(result, df) 2744s 2744s # Open file and check metadata for correct amount of compression 2744s with tables.open_file(tmpfile, mode="r") as h5table: 2744s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2744s assert node.filters.complevel == lvl 2744s if lvl == 0: 2744s assert node.filters.complib is None 2744s else: 2744s > assert node.filters.complib == lib 2744s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2744s E 2744s E - blosc2 2744s E + blosc2:blosclz 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2744s ___________________________ test_complibs[blosc2-8] ____________________________ 2744s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-282/test_complibs_blosc2_8_0') 2744s lvl = 8, lib = 'blosc2' 2744s request = > 2744s 2744s @pytest.mark.parametrize("lvl", range(10)) 2744s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2744s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2744s @pytest.mark.skipif( 2744s not PY311 and is_ci_environment() and is_platform_linux(), 2744s reason="Segfaulting in a CI environment" 2744s # with xfail, would sometimes raise UnicodeDecodeError 2744s # invalid state byte 2744s ) 2744s def test_complibs(tmp_path, lvl, lib, request): 2744s # GH14478 2744s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2744s request.applymarker( 2744s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2744s ) 2744s df = DataFrame( 2744s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2744s ) 2744s 2744s # Remove lzo if its not available on this platform 2744s if not tables.which_lib_version("lzo"): 2744s pytest.skip("lzo not available") 2744s # Remove bzip2 if its not available on this platform 2744s if not tables.which_lib_version("bzip2"): 2744s pytest.skip("bzip2 not available") 2744s 2744s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2744s gname = f"{lvl}_{lib}" 2744s 2744s # Write and read file to see if data is consistent 2744s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2744s result = read_hdf(tmpfile, gname) 2744s tm.assert_frame_equal(result, df) 2744s 2744s # Open file and check metadata for correct amount of compression 2744s with tables.open_file(tmpfile, mode="r") as h5table: 2744s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2744s assert node.filters.complevel == lvl 2744s if lvl == 0: 2744s assert node.filters.complib is None 2744s else: 2744s > assert node.filters.complib == lib 2744s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2744s E 2744s E - blosc2 2744s E + blosc2:blosclz 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2744s ___________________________ 
test_complibs[blosc2-9] ____________________________ 2744s tmp_path = PosixPath('/tmp/pytest-of-ubuntu/pytest-283/test_complibs_blosc2_9_0') 2744s lvl = 9, lib = 'blosc2' 2744s request = > 2744s 2744s @pytest.mark.parametrize("lvl", range(10)) 2744s @pytest.mark.parametrize("lib", tables.filters.all_complibs) 2744s @pytest.mark.filterwarnings("ignore:object name is not a valid") 2744s @pytest.mark.skipif( 2744s not PY311 and is_ci_environment() and is_platform_linux(), 2744s reason="Segfaulting in a CI environment" 2744s # with xfail, would sometimes raise UnicodeDecodeError 2744s # invalid state byte 2744s ) 2744s def test_complibs(tmp_path, lvl, lib, request): 2744s # GH14478 2744s if PY311 and is_platform_linux() and lib == "blosc2" and lvl != 0: 2744s request.applymarker( 2744s pytest.mark.xfail(reason=f"Fails for {lib} on Linux and PY > 3.11", strict=False) 2744s ) 2744s df = DataFrame( 2744s np.ones((30, 4)), columns=list("ABCD"), index=np.arange(30).astype(np.str_) 2744s ) 2744s 2744s # Remove lzo if its not available on this platform 2744s if not tables.which_lib_version("lzo"): 2744s pytest.skip("lzo not available") 2744s # Remove bzip2 if its not available on this platform 2744s if not tables.which_lib_version("bzip2"): 2744s pytest.skip("bzip2 not available") 2744s 2744s tmpfile = tmp_path / f"{lvl}_{lib}.h5" 2744s gname = f"{lvl}_{lib}" 2744s 2744s # Write and read file to see if data is consistent 2744s df.to_hdf(tmpfile, key=gname, complib=lib, complevel=lvl) 2744s result = read_hdf(tmpfile, gname) 2744s tm.assert_frame_equal(result, df) 2744s 2744s # Open file and check metadata for correct amount of compression 2744s with tables.open_file(tmpfile, mode="r") as h5table: 2744s for node in h5table.walk_nodes(where="/" + gname, classname="Leaf"): 2744s assert node.filters.complevel == lvl 2744s if lvl == 0: 2744s assert node.filters.complib is None 2744s else: 2744s > assert node.filters.complib == lib 2744s E AssertionError: assert 'blosc2:blosclz' == 'blosc2' 2744s E 2744s E - blosc2 2744s E + blosc2:blosclz 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:301: AssertionError 2744s ______________________ test_append_frame_column_oriented _______________________ 2744s self = 2744s node = , kwargs = {'side': 'right'} 2744s value = 2744s slobj = slice(0, 4, None) 2744s 2744s def visit_Subscript(self, node, **kwargs) -> ops.Term: 2744s # only allow simple subscripts 2744s 2744s value = self.visit(node.value) 2744s slobj = self.visit(node.slice) 2744s try: 2744s value = value.value 2744s except AttributeError: 2744s pass 2744s 2744s if isinstance(slobj, Term): 2744s # In py39 np.ndarray lookups with Term containing int raise 2744s slobj = slobj.value 2744s 2744s try: 2744s > return self.const_type(value[slobj], self.env) 2744s E TypeError: 'builtin_function_or_method' object is not subscriptable 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:468: TypeError 2744s 2744s The above exception was the direct cause of the following exception: 2744s 2744s setup_path = 'tmp.__b73e48dc-9817-45be-8bfb-3170059905a8__.h5' 2744s 2744s @pytest.mark.xfail(condition=PY312 or is_crashing_arch, reason="https://bugs.debian.org/1055801 and https://bugs.debian.org/790925",raises=ValueError,strict=False, run=not is_crashing_arch) 2744s def test_append_frame_column_oriented(setup_path): 2744s with ensure_clean_store(setup_path) as store: 2744s # column oriented 2744s df = DataFrame( 2744s 
np.random.default_rng(2).standard_normal((10, 4)), 2744s columns=Index(list("ABCD"), dtype=object), 2744s index=date_range("2000-01-01", periods=10, freq="B"), 2744s ) 2744s df.index = df.index._with_freq(None) # freq doesn't round-trip 2744s 2744s _maybe_remove(store, "df1") 2744s store.append("df1", df.iloc[:, :2], axes=["columns"]) 2744s store.append("df1", df.iloc[:, 2:]) 2744s tm.assert_frame_equal(store["df1"], df) 2744s 2744s result = store.select("df1", "columns=A") 2744s expected = df.reindex(columns=["A"]) 2744s tm.assert_frame_equal(expected, result) 2744s 2744s # selection on the non-indexable 2744s > result = store.select("df1", ("columns=A", "index=df.index[0:4]")) 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py:311: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s File path: /tmp/tmpvwezfm39/tmp.__b73e48dc-9817-45be-8bfb-3170059905a8__.h5 2744s 2744s key = 'df1', where = [columns=A, index=df.index[0:4]], start = None, stop = None 2744s columns = None, iterator = False, chunksize = None, auto_close = False 2744s 2744s def select( 2744s self, 2744s key: str, 2744s where=None, 2744s start=None, 2744s stop=None, 2744s columns=None, 2744s iterator: bool = False, 2744s chunksize: int | None = None, 2744s auto_close: bool = False, 2744s ): 2744s """ 2744s Retrieve pandas object stored in file, optionally based on where criteria. 2744s 2744s .. warning:: 2744s 2744s Pandas uses PyTables for reading and writing HDF5 files, which allows 2744s serializing object-dtype data with pickle when using the "fixed" format. 2744s Loading pickled data received from untrusted sources can be unsafe. 2744s 2744s See: https://docs.python.org/3/library/pickle.html for more. 2744s 2744s Parameters 2744s ---------- 2744s key : str 2744s Object being retrieved from file. 2744s where : list or None 2744s List of Term (or convertible) objects, optional. 2744s start : int or None 2744s Row number to start selection. 2744s stop : int, default None 2744s Row number to stop selection. 2744s columns : list or None 2744s A list of columns that if not None, will limit the return columns. 2744s iterator : bool or False 2744s Returns an iterator. 2744s chunksize : int or None 2744s Number or rows to include in iteration, return an iterator. 2744s auto_close : bool or False 2744s Should automatically close the store when finished. 2744s 2744s Returns 2744s ------- 2744s object 2744s Retrieved object from file. 
2744s 2744s Examples 2744s -------- 2744s >>> df = pd.DataFrame([[1, 2], [3, 4]], columns=['A', 'B']) 2744s >>> store = pd.HDFStore("store.h5", 'w') # doctest: +SKIP 2744s >>> store.put('data', df) # doctest: +SKIP 2744s >>> store.get('data') # doctest: +SKIP 2744s >>> print(store.keys()) # doctest: +SKIP 2744s ['/data1', '/data2'] 2744s >>> store.select('/data1') # doctest: +SKIP 2744s A B 2744s 0 1 2 2744s 1 3 4 2744s >>> store.select('/data1', where='columns == A') # doctest: +SKIP 2744s A 2744s 0 1 2744s 1 3 2744s >>> store.close() # doctest: +SKIP 2744s """ 2744s group = self.get_node(key) 2744s if group is None: 2744s raise KeyError(f"No object named {key} in the file") 2744s 2744s # create the storer and axes 2744s where = _ensure_term(where, scope_level=1) 2744s s = self._create_storer(group) 2744s s.infer_axes() 2744s 2744s # function to call on iteration 2744s def func(_start, _stop, _where): 2744s return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2744s 2744s # create the iterator 2744s it = TableIterator( 2744s self, 2744s s, 2744s func, 2744s where=where, 2744s nrows=s.nrows, 2744s start=start, 2744s stop=stop, 2744s iterator=iterator, 2744s chunksize=chunksize, 2744s auto_close=auto_close, 2744s ) 2744s 2744s > return it.get_result() 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:915: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s coordinates = False 2744s 2744s def get_result(self, coordinates: bool = False): 2744s # return the actual iterator 2744s if self.chunksize is not None: 2744s if not isinstance(self.s, Table): 2744s raise TypeError("can only use an iterator or chunksize on a table") 2744s 2744s self.coordinates = self.s.read_coordinates(where=self.where) 2744s 2744s return self 2744s 2744s # if specified read via coordinates (necessary for multiple selections 2744s if coordinates: 2744s if not isinstance(self.s, Table): 2744s raise TypeError("can only read_coordinates on a table") 2744s where = self.s.read_coordinates( 2744s where=self.where, start=self.start, stop=self.stop 2744s ) 2744s else: 2744s where = self.where 2744s 2744s # directly return the result 2744s > results = self.func(self.start, self.stop, where) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:2038: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s _start = 0, _stop = 4, _where = [columns=A, index=df.index[0:4]] 2744s 2744s def func(_start, _stop, _where): 2744s > return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:899: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7a110637da30> 2744s where = [columns=A, index=df.index[0:4]], columns = None, start = 0, stop = 4 2744s 2744s def read( 2744s self, 2744s where=None, 2744s columns=None, 2744s start: int | None = None, 2744s stop: int | None = None, 2744s ): 2744s # validate the version 2744s self.validate_version(where) 2744s 2744s # infer the data kind 2744s if not self.infer_axes(): 2744s return None 2744s 2744s > result = self._read_axes(where=where, start=start, stop=stop) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:4640: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 
<[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7a110637da30> 2744s where = [columns=A, index=df.index[0:4]], start = 0, stop = 4 2744s 2744s def _read_axes( 2744s self, where, start: int | None = None, stop: int | None = None 2744s ) -> list[tuple[np.ndarray, np.ndarray] | tuple[Index, Index]]: 2744s """ 2744s Create the axes sniffed from the table. 2744s 2744s Parameters 2744s ---------- 2744s where : ??? 2744s start : int or None, default None 2744s stop : int or None, default None 2744s 2744s Returns 2744s ------- 2744s List[Tuple[index_values, column_values]] 2744s """ 2744s # create the selection 2744s > selection = Selection(self, where=where, start=start, stop=stop) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:3826: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s table = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7a110637da30> 2744s where = [columns=A, index=df.index[0:4]], start = 0, stop = 4 2744s 2744s def __init__( 2744s self, 2744s table: Table, 2744s where=None, 2744s start: int | None = None, 2744s stop: int | None = None, 2744s ) -> None: 2744s self.table = table 2744s self.where = where 2744s self.start = start 2744s self.stop = stop 2744s self.condition = None 2744s self.filter = None 2744s self.terms = None 2744s self.coordinates = None 2744s 2744s if is_list_like(where): 2744s # see if we have a passed coordinate like 2744s with suppress(ValueError): 2744s inferred = lib.infer_dtype(where, skipna=False) 2744s if inferred in ("integer", "boolean"): 2744s where = np.asarray(where) 2744s if where.dtype == np.bool_: 2744s start, stop = self.start, self.stop 2744s if start is None: 2744s start = 0 2744s if stop is None: 2744s stop = self.table.nrows 2744s self.coordinates = np.arange(start, stop)[where] 2744s elif issubclass(where.dtype.type, np.integer): 2744s if (self.start is not None and (where < self.start).any()) or ( 2744s self.stop is not None and (where >= self.stop).any() 2744s ): 2744s raise ValueError( 2744s "where must have index locations >= start and < stop" 2744s ) 2744s self.coordinates = where 2744s 2744s if self.coordinates is None: 2744s > self.terms = self.generate(where) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5367: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s where = [columns=A, index=df.index[0:4]] 2744s 2744s def generate(self, where): 2744s """where can be a : dict,list,tuple,string""" 2744s if where is None: 2744s return None 2744s 2744s q = self.table.queryables() 2744s try: 2744s > return PyTablesExpr(where, queryables=q, encoding=self.table.encoding) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5380: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = (columns=A) & (index=df.index[0:4]) 2744s where = [columns=A, index=df.index[0:4]] 2744s queryables = {'columns': name->columns,cname->columns,axis->1,pos->0,kind->string, 'index': None} 2744s encoding = 'UTF-8', scope_level = 0 2744s 2744s def __init__( 2744s self, 2744s where, 2744s queryables: dict[str, Any] | None = None, 2744s encoding=None, 2744s scope_level: int = 0, 2744s ) -> None: 2744s where = _validate_where(where) 2744s 2744s self.encoding = encoding 2744s self.condition = None 2744s self.filter = None 2744s self.terms = None 2744s 
self._visitor = None 2744s 2744s # capture the environment if needed 2744s local_dict: _scope.DeepChainMap[Any, Any] | None = None 2744s 2744s if isinstance(where, PyTablesExpr): 2744s local_dict = where.env.scope 2744s _where = where.expr 2744s 2744s elif is_list_like(where): 2744s where = list(where) 2744s for idx, w in enumerate(where): 2744s if isinstance(w, PyTablesExpr): 2744s local_dict = w.env.scope 2744s else: 2744s where[idx] = _validate_where(w) 2744s _where = " & ".join([f"({w})" for w in com.flatten(where)]) 2744s else: 2744s # _validate_where ensures we otherwise have a string 2744s _where = where 2744s 2744s self.expr = _where 2744s self.env = PyTablesScope(scope_level + 1, local_dict=local_dict) 2744s 2744s if queryables is not None and isinstance(self.expr, str): 2744s self.env.queryables.update(queryables) 2744s self._visitor = PyTablesExprVisitor( 2744s self.env, 2744s queryables=queryables, 2744s parser="pytables", 2744s engine="pytables", 2744s encoding=encoding, 2744s ) 2744s > self.terms = self.parse() 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:610: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = (columns=A) & (index=df.index[0:4]) 2744s 2744s def parse(self): 2744s """ 2744s Parse an expression. 2744s """ 2744s > return self._visitor.visit(self.expr) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:824: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s clean = '(columns ==A )and (index ==df .index [0 :4 ])', method = 'visit_Module' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s expr = 2744s 2744s def visit_Module(self, node, **kwargs): 2744s if len(node.body) != 1: 2744s raise SyntaxError("only a single expression is allowed") 2744s expr = node.body[0] 2744s > return self.visit(expr, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:417: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {}, method = 'visit_Expr' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = 
{} 2744s 2744s def visit_Expr(self, node, **kwargs): 2744s > return self.visit(node.value, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:420: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s method = 'visit_BoolOp' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s visitor = .visitor at 0x7a1104126b60> 2744s operands = [, ] 2744s 2744s def visit_BoolOp(self, node, **kwargs): 2744s def visitor(x, y): 2744s lhs = self._try_visit_binop(x) 2744s rhs = self._try_visit_binop(y) 2744s 2744s op, op_class, lhs, rhs = self._maybe_transform_eq_ne(node, lhs, rhs) 2744s return self._maybe_evaluate_binop(op, node.op, lhs, rhs) 2744s 2744s operands = node.values 2744s > return reduce(visitor, operands) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:742: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s x = 2744s y = 2744s 2744s def visitor(x, y): 2744s lhs = self._try_visit_binop(x) 2744s > rhs = self._try_visit_binop(y) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:736: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s bop = 2744s 2744s def _try_visit_binop(self, bop): 2744s if isinstance(bop, (Op, Term)): 2744s return bop 2744s > return self.visit(bop) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:731: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s method = 'visit_Compare' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s ops = [] 2744s comps = [] 2744s op = 2744s binop = 2744s 2744s def visit_Compare(self, node, **kwargs): 2744s ops = node.ops 2744s comps = node.comparators 2744s 2744s # base case: we have something like a CMP b 2744s if len(comps) == 1: 2744s op = self.translate_In(ops[0]) 2744s binop = ast.BinOp(op=op, left=node.left, right=comps[0]) 2744s > return self.visit(binop) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:715: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {}, method = 'visit_BinOp' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s 2744s def visit_BinOp(self, node, **kwargs): 2744s > op, op_class, left, right = self._maybe_transform_eq_ne(node) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:531: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , left = index, right = None 2744s 2744s def _maybe_transform_eq_ne(self, node, left=None, right=None): 2744s if left is None: 2744s left = self.visit(node.left, side="left") 2744s if right is None: 2744s > right = self.visit(node.right, side="right") 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:453: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {'side': 'right'} 2744s method = 'visit_Subscript' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {'side': 'right'} 2744s value = 2744s slobj = slice(0, 4, None) 2744s 2744s def visit_Subscript(self, node, **kwargs) -> ops.Term: 2744s # only allow simple subscripts 2744s 2744s value = self.visit(node.value) 2744s slobj = self.visit(node.slice) 2744s try: 2744s value = value.value 2744s except AttributeError: 2744s pass 2744s 2744s if isinstance(slobj, Term): 2744s # In py39 np.ndarray lookups with Term containing int raise 2744s slobj = slobj.value 2744s 2744s try: 2744s return self.const_type(value[slobj], self.env) 2744s except TypeError as err: 2744s > raise ValueError( 2744s f"cannot subscript {repr(value)} with {repr(slobj)}" 2744s ) from err 2744s E ValueError: cannot subscript with slice(0, 4, None) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:470: ValueError 2744s __________________________ test_select_filter_corner ___________________________ 2744s setup_path = 'tmp.__4b7b3888-f1f6-408d-8ed3-1cab7f38e47a__.h5' 2744s 2744s @pytest.mark.xfail(condition=PY312 or is_crashing_arch, reason="https://bugs.debian.org/1055801 and https://bugs.debian.org/790925",raises=ValueError,strict=False, run=not is_crashing_arch) 2744s def 
test_select_filter_corner(setup_path): 2744s df = DataFrame(np.random.default_rng(2).standard_normal((50, 100))) 2744s df.index = [f"{c:3d}" for c in df.index] 2744s df.columns = [f"{c:3d}" for c in df.columns] 2744s 2744s with ensure_clean_store(setup_path) as store: 2744s store.put("frame", df, format="table") 2744s 2744s crit = "columns=df.columns[:75]" 2744s > result = store.select("frame", [crit]) 2744s 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py:899: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s File path: /tmp/tmpbhkn54zp/tmp.__4b7b3888-f1f6-408d-8ed3-1cab7f38e47a__.h5 2744s 2744s key = 'frame', where = [columns=df.columns[:75]], start = None, stop = None 2744s columns = None, iterator = False, chunksize = None, auto_close = False 2744s 2744s def select( 2744s self, 2744s key: str, 2744s where=None, 2744s start=None, 2744s stop=None, 2744s columns=None, 2744s iterator: bool = False, 2744s chunksize: int | None = None, 2744s auto_close: bool = False, 2744s ): 2744s """ 2744s Retrieve pandas object stored in file, optionally based on where criteria. 2744s 2744s .. warning:: 2744s 2744s Pandas uses PyTables for reading and writing HDF5 files, which allows 2744s serializing object-dtype data with pickle when using the "fixed" format. 2744s Loading pickled data received from untrusted sources can be unsafe. 2744s 2744s See: https://docs.python.org/3/library/pickle.html for more. 2744s 2744s Parameters 2744s ---------- 2744s key : str 2744s Object being retrieved from file. 2744s where : list or None 2744s List of Term (or convertible) objects, optional. 2744s start : int or None 2744s Row number to start selection. 2744s stop : int, default None 2744s Row number to stop selection. 2744s columns : list or None 2744s A list of columns that if not None, will limit the return columns. 2744s iterator : bool or False 2744s Returns an iterator. 2744s chunksize : int or None 2744s Number or rows to include in iteration, return an iterator. 2744s auto_close : bool or False 2744s Should automatically close the store when finished. 2744s 2744s Returns 2744s ------- 2744s object 2744s Retrieved object from file. 
2744s 2744s Examples 2744s -------- 2744s >>> df = pd.DataFrame([[1, 2], [3, 4]], columns=['A', 'B']) 2744s >>> store = pd.HDFStore("store.h5", 'w') # doctest: +SKIP 2744s >>> store.put('data', df) # doctest: +SKIP 2744s >>> store.get('data') # doctest: +SKIP 2744s >>> print(store.keys()) # doctest: +SKIP 2744s ['/data1', '/data2'] 2744s >>> store.select('/data1') # doctest: +SKIP 2744s A B 2744s 0 1 2 2744s 1 3 4 2744s >>> store.select('/data1', where='columns == A') # doctest: +SKIP 2744s A 2744s 0 1 2744s 1 3 2744s >>> store.close() # doctest: +SKIP 2744s """ 2744s group = self.get_node(key) 2744s if group is None: 2744s raise KeyError(f"No object named {key} in the file") 2744s 2744s # create the storer and axes 2744s where = _ensure_term(where, scope_level=1) 2744s s = self._create_storer(group) 2744s s.infer_axes() 2744s 2744s # function to call on iteration 2744s def func(_start, _stop, _where): 2744s return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2744s 2744s # create the iterator 2744s it = TableIterator( 2744s self, 2744s s, 2744s func, 2744s where=where, 2744s nrows=s.nrows, 2744s start=start, 2744s stop=stop, 2744s iterator=iterator, 2744s chunksize=chunksize, 2744s auto_close=auto_close, 2744s ) 2744s 2744s > return it.get_result() 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:915: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s coordinates = False 2744s 2744s def get_result(self, coordinates: bool = False): 2744s # return the actual iterator 2744s if self.chunksize is not None: 2744s if not isinstance(self.s, Table): 2744s raise TypeError("can only use an iterator or chunksize on a table") 2744s 2744s self.coordinates = self.s.read_coordinates(where=self.where) 2744s 2744s return self 2744s 2744s # if specified read via coordinates (necessary for multiple selections 2744s if coordinates: 2744s if not isinstance(self.s, Table): 2744s raise TypeError("can only read_coordinates on a table") 2744s where = self.s.read_coordinates( 2744s where=self.where, start=self.start, stop=self.stop 2744s ) 2744s else: 2744s where = self.where 2744s 2744s # directly return the result 2744s > results = self.func(self.start, self.stop, where) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:2038: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s _start = 0, _stop = 50, _where = [columns=df.columns[:75]] 2744s 2744s def func(_start, _stop, _where): 2744s > return s.read(start=_start, stop=_stop, where=_where, columns=columns) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:899: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7a1104248a40> 2744s where = [columns=df.columns[:75]], columns = None, start = 0, stop = 50 2744s 2744s def read( 2744s self, 2744s where=None, 2744s columns=None, 2744s start: int | None = None, 2744s stop: int | None = None, 2744s ): 2744s # validate the version 2744s self.validate_version(where) 2744s 2744s # infer the data kind 2744s if not self.infer_axes(): 2744s return None 2744s 2744s > result = self._read_axes(where=where, start=start, stop=stop) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:4640: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 
<[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7a1104248a40> 2744s where = [columns=df.columns[:75]], start = 0, stop = 50 2744s 2744s def _read_axes( 2744s self, where, start: int | None = None, stop: int | None = None 2744s ) -> list[tuple[np.ndarray, np.ndarray] | tuple[Index, Index]]: 2744s """ 2744s Create the axes sniffed from the table. 2744s 2744s Parameters 2744s ---------- 2744s where : ??? 2744s start : int or None, default None 2744s stop : int or None, default None 2744s 2744s Returns 2744s ------- 2744s List[Tuple[index_values, column_values]] 2744s """ 2744s # create the selection 2744s > selection = Selection(self, where=where, start=start, stop=stop) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:3826: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s table = <[ClosedNodeError('the node object is closed') raised in repr()] AppendableFrameTable object at 0x7a1104248a40> 2744s where = [columns=df.columns[:75]], start = 0, stop = 50 2744s 2744s def __init__( 2744s self, 2744s table: Table, 2744s where=None, 2744s start: int | None = None, 2744s stop: int | None = None, 2744s ) -> None: 2744s self.table = table 2744s self.where = where 2744s self.start = start 2744s self.stop = stop 2744s self.condition = None 2744s self.filter = None 2744s self.terms = None 2744s self.coordinates = None 2744s 2744s if is_list_like(where): 2744s # see if we have a passed coordinate like 2744s with suppress(ValueError): 2744s inferred = lib.infer_dtype(where, skipna=False) 2744s if inferred in ("integer", "boolean"): 2744s where = np.asarray(where) 2744s if where.dtype == np.bool_: 2744s start, stop = self.start, self.stop 2744s if start is None: 2744s start = 0 2744s if stop is None: 2744s stop = self.table.nrows 2744s self.coordinates = np.arange(start, stop)[where] 2744s elif issubclass(where.dtype.type, np.integer): 2744s if (self.start is not None and (where < self.start).any()) or ( 2744s self.stop is not None and (where >= self.stop).any() 2744s ): 2744s raise ValueError( 2744s "where must have index locations >= start and < stop" 2744s ) 2744s self.coordinates = where 2744s 2744s if self.coordinates is None: 2744s > self.terms = self.generate(where) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5367: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s where = [columns=df.columns[:75]] 2744s 2744s def generate(self, where): 2744s """where can be a : dict,list,tuple,string""" 2744s if where is None: 2744s return None 2744s 2744s q = self.table.queryables() 2744s try: 2744s > return PyTablesExpr(where, queryables=q, encoding=self.table.encoding) 2744s 2744s /usr/lib/python3/dist-packages/pandas/io/pytables.py:5380: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = (columns=df.columns[:75]), where = [columns=df.columns[:75]] 2744s queryables = {'columns': None, 'index': name->index,cname->index,axis->0,pos->0,kind->string} 2744s encoding = 'UTF-8', scope_level = 0 2744s 2744s def __init__( 2744s self, 2744s where, 2744s queryables: dict[str, Any] | None = None, 2744s encoding=None, 2744s scope_level: int = 0, 2744s ) -> None: 2744s where = _validate_where(where) 2744s 2744s self.encoding = encoding 2744s self.condition = None 2744s self.filter = None 2744s self.terms = None 2744s self._visitor = None 2744s 2744s # capture the 
environment if needed 2744s local_dict: _scope.DeepChainMap[Any, Any] | None = None 2744s 2744s if isinstance(where, PyTablesExpr): 2744s local_dict = where.env.scope 2744s _where = where.expr 2744s 2744s elif is_list_like(where): 2744s where = list(where) 2744s for idx, w in enumerate(where): 2744s if isinstance(w, PyTablesExpr): 2744s local_dict = w.env.scope 2744s else: 2744s where[idx] = _validate_where(w) 2744s _where = " & ".join([f"({w})" for w in com.flatten(where)]) 2744s else: 2744s # _validate_where ensures we otherwise have a string 2744s _where = where 2744s 2744s self.expr = _where 2744s self.env = PyTablesScope(scope_level + 1, local_dict=local_dict) 2744s 2744s if queryables is not None and isinstance(self.expr, str): 2744s self.env.queryables.update(queryables) 2744s self._visitor = PyTablesExprVisitor( 2744s self.env, 2744s queryables=queryables, 2744s parser="pytables", 2744s engine="pytables", 2744s encoding=encoding, 2744s ) 2744s > self.terms = self.parse() 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:610: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = (columns=df.columns[:75]) 2744s 2744s def parse(self): 2744s """ 2744s Parse an expression. 2744s """ 2744s > return self._visitor.visit(self.expr) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:824: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s clean = '(columns ==df .columns [:75 ])', method = 'visit_Module' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s expr = 2744s 2744s def visit_Module(self, node, **kwargs): 2744s if len(node.body) != 1: 2744s raise SyntaxError("only a single expression is allowed") 2744s expr = node.body[0] 2744s > return self.visit(expr, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:417: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {}, method = 'visit_Expr' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s 2744s def visit_Expr(self, node, **kwargs): 2744s > return 
self.visit(node.value, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:420: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s method = 'visit_Compare' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s ops = [] 2744s comps = [] 2744s op = 2744s binop = 2744s 2744s def visit_Compare(self, node, **kwargs): 2744s ops = node.ops 2744s comps = node.comparators 2744s 2744s # base case: we have something like a CMP b 2744s if len(comps) == 1: 2744s op = self.translate_In(ops[0]) 2744s binop = ast.BinOp(op=op, left=node.left, right=comps[0]) 2744s > return self.visit(binop) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:715: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {}, method = 'visit_BinOp' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s 2744s def visit_BinOp(self, node, **kwargs): 2744s > op, op_class, left, right = self._maybe_transform_eq_ne(node) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:531: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , left = columns, right = None 2744s 2744s def _maybe_transform_eq_ne(self, node, left=None, right=None): 2744s if left is None: 2744s left = self.visit(node.left, side="left") 2744s if right is None: 2744s > right = self.visit(node.right, side="right") 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:453: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {'side': 'right'} 2744s method = 'visit_Subscript' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s 
visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {'side': 'right'} 2744s 2744s def visit_Subscript(self, node, **kwargs) -> ops.Term: 2744s # only allow simple subscripts 2744s 2744s > value = self.visit(node.value) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:456: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {} 2744s method = 'visit_Attribute' 2744s visitor = > 2744s 2744s def visit(self, node, **kwargs): 2744s if isinstance(node, str): 2744s clean = self.preparser(node) 2744s try: 2744s node = ast.fix_missing_locations(ast.parse(clean)) 2744s except SyntaxError as e: 2744s if any(iskeyword(x) for x in clean.split()): 2744s e.msg = "Python keyword not valid identifier in numexpr query" 2744s raise e 2744s 2744s method = f"visit_{type(node).__name__}" 2744s visitor = getattr(self, method) 2744s > return visitor(node, **kwargs) 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/expr.py:411: 2744s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 2744s 2744s self = 2744s node = , kwargs = {}, attr = 'columns' 2744s value = , ctx = 2744s resolved = 'df' 2744s 2744s def visit_Attribute(self, node, **kwargs): 2744s attr = node.attr 2744s value = node.value 2744s 2744s ctx = type(node.ctx) 2744s if ctx == ast.Load: 2744s # resolve the value 2744s resolved = self.visit(value) 2744s 2744s # try to get the value to see if we are another expression 2744s try: 2744s resolved = resolved.value 2744s except AttributeError: 2744s pass 2744s 2744s try: 2744s return self.term_type(getattr(resolved, attr), self.env) 2744s except AttributeError: 2744s # something like datetime.datetime where scope is overridden 2744s if isinstance(value, ast.Name) and value.id == attr: 2744s return resolved 2744s 2744s > raise ValueError(f"Invalid Attribute context {ctx.__name__}") 2744s E ValueError: Invalid Attribute context Load 2744s 2744s /usr/lib/python3/dist-packages/pandas/core/computation/pytables.py:496: ValueError 2744s =============================== warnings summary =============================== 2744s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:39 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py:39: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2744s pytestmark = pytest.mark.single_cpu 2744s 2744s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py:30 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py:30: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2744s pytestmark = pytest.mark.single_cpu 2744s 2744s ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py:39 2744s /usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py:39: PytestUnknownMarkWarning: Unknown pytest.mark.single_cpu - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 2744s pytestmark = pytest.mark.single_cpu 2744s 2744s test_file_handling.py: 191 warnings 2744s test_append.py: 21 warnings 2744s test_store.py: 66 warnings 2744s /usr/lib/python3/dist-packages/py/_process/forkedfunc.py:45: DeprecationWarning: This process (pid=33268) is multi-threaded, use of fork() may lead to deadlocks in the child. 2744s pid = os.fork() 2744s 2744s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475 2744s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:475: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/io/pytables/.pytest_cache/v/cache/nodeids: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/io/pytables/pytest-cache-files-spy6npff' 2744s config.cache.set("cache/nodeids", sorted(self.cached_nodeids)) 2744s 2744s ../../../usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429 2744s /usr/lib/python3/dist-packages/_pytest/cacheprovider.py:429: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/io/pytables/.pytest_cache/v/cache/lastfailed: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/io/pytables/pytest-cache-files-uv8h28f3' 2744s config.cache.set("cache/lastfailed", self.lastfailed) 2744s 2744s ../../../usr/lib/python3/dist-packages/_pytest/stepwise.py:51 2744s /usr/lib/python3/dist-packages/_pytest/stepwise.py:51: PytestCacheWarning: could not create cache path /usr/lib/python3/dist-packages/pandas/tests/io/pytables/.pytest_cache/v/cache/stepwise: [Errno 13] Permission denied: '/usr/lib/python3/dist-packages/pandas/tests/io/pytables/pytest-cache-files-gjhx7c8j' 2744s session.config.cache.set(STEPWISE_CACHE_DIR, []) 2744s 2744s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 2744s =========================== short test summary info ============================ 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-1] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-2] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-3] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-4] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-5] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-6] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-7] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-8] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_file_handling.py::test_complibs[blosc2-9] 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_append.py::test_append_frame_column_oriented 2744s FAILED ../../../usr/lib/python3/dist-packages/pandas/tests/io/pytables/test_store.py::test_select_filter_corner 2744s ================ 11 failed, 267 passed, 284 warnings in 14.58s ================= 2744s pymysql/psycopg2 tests, which do not work in this test environment 2745s 
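Editor's note: the nine test_complibs[blosc2-*] failures above all reduce to one string comparison. The PyTables build in this testbed reports the Blosc2 filter as 'blosc2:blosclz' (library name plus its default sub-codec), while the packaged test asserts equality with the bare 'blosc2'. Below is a minimal sketch of the mismatch together with a tolerant comparison; the file path and key are illustrative (not taken from the test run), and the prefix comparison is a workaround sketch, not the upstream fix.

import numpy as np
import pandas as pd
import tables

# Illustrative path and key (hypothetical, not from the packaged test).
path, key = "/tmp/blosc2_demo.h5", "demo"

df = pd.DataFrame(np.ones((30, 4)), columns=list("ABCD"))
df.to_hdf(path, key=key, complib="blosc2", complevel=6)

with tables.open_file(path, mode="r") as h5:
    for node in h5.walk_nodes("/" + key, classname="Leaf"):
        reported = node.filters.complib   # 'blosc2:blosclz' with PyTables >= 3.9
        # Comparing only the library prefix accepts both spellings.
        assert reported.split(":", 1)[0] == "blosc2"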
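Editor's note: test_append_frame_column_oriented and test_select_filter_corner fail for a different reason. On Python 3.12 the PyTables where-string parser can no longer resolve subscript expressions such as index=df.index[0:4] or columns=df.columns[:75] (the upstream xfail markers point at https://bugs.debian.org/1055801 and https://bugs.debian.org/790925). The sketch below expresses equivalent selections for data like these test frames without putting a subscript inside the query string, using only arguments documented in the select() docstring quoted above (start/stop and columns); the store path and key are illustrative.

import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.default_rng(2).standard_normal((50, 100)))
df.columns = [f"{c:3d}" for c in df.columns]

with pd.HDFStore("/tmp/where_demo.h5", mode="w") as store:   # illustrative path
    store.put("frame", df, format="table")

    # Fails on Python 3.12:  store.select("frame", "columns=df.columns[:75]")
    wanted = df.columns[:75].tolist()
    subset = store.select("frame", columns=wanted)   # column filter as a keyword argument

    # Fails on Python 3.12:  store.select("frame", "index=df.index[0:4]")
    head = store.select("frame", start=0, stop=4)    # positional row slice instead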
============================= test session starts ============================== 2745s platform linux -- Python 3.12.7, pytest-8.3.3, pluggy-1.5.0 2745s PyQt5 5.15.11 -- Qt runtime 5.15.15 -- Qt compiled 5.15.15 2745s rootdir: /usr/lib/python3/dist-packages/pandas/tests 2745s plugins: forked-1.6.0, typeguard-4.4.1, localserver-0.0.0, asyncio-0.20.3, hypothesis-6.119.3, xdist-3.6.1, qt-4.3.1 2745s asyncio: mode=Mode.STRICT 2745s collected 2513 items 2745s 4087s ../../../usr/lib/python3/dist-packages/pandas/tests/io/test_sql.py FEFEFEFE....ssFEFEFEFE....ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE...FEFEFEFE....ssFEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...xssFEFEFEFEFEFEFEFExxxxFEFExxxxssFEFEFEFE...FEFEFEFE....ssFEFEFEFE....ssxxxxxxxx....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE...xssFEFEFEFE...xssFEFEFEFE...xssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE...xssFEFEFEFE...xssFEFEFEFE...xssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssxxxxFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE....ssFEFEFEFE...xssFEFEFEFE...xssFEFEFEFE...xss.FEFEFEFE...xssFEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE....EFE.EFE....ss.EFE.EFE....ssFEFEFEFE...FEFEFEFE...FEFEFEFE...xss.ss...FEFEFEFE..sFEFEFEFE..sFEFEFEFE..s.ssFEFEFEFE....ssFEFEFEFE...FEFEFEFE...FEFEFEFE...xxxxFEFExxxFEFEFEFE...FEFEFEFExxxFEFEFEFEFEFEFEFEFEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE...FEFEFEFE....ssFEautopkgtest-virt-ssh [01:30:08]: ------- nova console-log 02e937a9-4661-4e96-a4a7-354d31049823 (adt-plucky-i386-pandas-20241123-203106-juju-7f2275-prod-proposed-migration-environment-20-cde5351b-7499-48e1-9397-9f9ae29b1add) ------ 4087s ERROR (CommandError): No server with a name or ID of '02e937a9-4661-4e96-a4a7-354d31049823' exists. 4087s --------------------------------------------------- 4087s ------- nova show 02e937a9-4661-4e96-a4a7-354d31049823 (adt-plucky-i386-pandas-20241123-203106-juju-7f2275-prod-proposed-migration-environment-20-cde5351b-7499-48e1-9397-9f9ae29b1add) ------ 4087s ERROR (CommandError): No server with a name or ID of '02e937a9-4661-4e96-a4a7-354d31049823' exists. 4087s --------------------------------------------------- 4087s 4108s sudo: /var/tmp/autopkgtest-run-wrapper: command not found 4112s nova [W] Skipping flock for amd64 4112s Creating nova instance adt-plucky-i386-pandas-20241123-203106-juju-7f2275-prod-proposed-migration-environment-20-cde5351b-7499-48e1-9397-9f9ae29b1add from image adt/ubuntu-plucky-amd64-server-20241119.img (UUID 2e5306de-7efa-448c-bc27-5518979e66f0)... 4112s ERROR (CommandError): Unable to delete the specified server(s). 4112s autopkgtest [01:30:33]: ERROR: testbed failure: testbed auxverb failed with exit code 255